Oct 06 06:45:14 crc systemd[1]: Starting Kubernetes Kubelet... Oct 06 06:45:14 crc restorecon[4672]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c176,c499 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 06 06:45:14 crc 
restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 
06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 
06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 
06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 06:45:14 
crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 06:45:14 crc
restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c14 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 06:45:14 crc 
restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c19,c24 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 
06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 06 06:45:14 crc 
restorecon[4672]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 06:45:14 crc 
restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 
06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c6 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc 
restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized
by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as
customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as
customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized
by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as
customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 06:45:14 crc restorecon[4672]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc 
restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:14 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 
crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc 
restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc 
restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc 
restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc 
restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc 
restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 06:45:15 crc restorecon[4672]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 06:45:15 crc restorecon[4672]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 06 06:45:15 crc kubenswrapper[4845]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 06 06:45:15 crc kubenswrapper[4845]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 06 06:45:15 crc kubenswrapper[4845]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 06 06:45:15 crc kubenswrapper[4845]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 06 06:45:15 crc kubenswrapper[4845]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 06 06:45:15 crc kubenswrapper[4845]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.949738 4845 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954501 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954516 4845 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954521 4845 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954527 4845 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954532 4845 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954538 4845 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954544 4845 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954549 4845 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954553 4845 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954557 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954561 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954565 4845 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954569 4845 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954574 4845 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954578 4845 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954582 4845 feature_gate.go:330] unrecognized feature gate: Example Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954585 4845 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954589 4845 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954593 4845 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954597 4845 feature_gate.go:330] 
unrecognized feature gate: MetricsCollectionProfiles Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954601 4845 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954606 4845 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954609 4845 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954614 4845 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954617 4845 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954621 4845 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954625 4845 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954629 4845 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954633 4845 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954638 4845 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954643 4845 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954647 4845 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954650 4845 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954654 4845 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 06 
06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954658 4845 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954662 4845 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954665 4845 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954669 4845 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954672 4845 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954676 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954681 4845 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954686 4845 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954690 4845 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954694 4845 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954699 4845 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954703 4845 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954709 4845 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954713 4845 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954718 4845 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954722 4845 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954726 4845 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954732 4845 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954736 4845 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954741 4845 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954745 4845 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954749 4845 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954753 4845 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954756 4845 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954761 4845 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954765 4845 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954770 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 06 06:45:15 crc 
kubenswrapper[4845]: W1006 06:45:15.954774 4845 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954779 4845 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954782 4845 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954786 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954790 4845 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954795 4845 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954805 4845 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954812 4845 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954817 4845 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.954823 4845 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955778 4845 flags.go:64] FLAG: --address="0.0.0.0" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955793 4845 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955804 4845 flags.go:64] FLAG: --anonymous-auth="true" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955811 4845 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955817 4845 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 06 06:45:15 
crc kubenswrapper[4845]: I1006 06:45:15.955823 4845 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955830 4845 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955838 4845 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955843 4845 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955849 4845 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955854 4845 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955860 4845 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955864 4845 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955869 4845 flags.go:64] FLAG: --cgroup-root="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955874 4845 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955879 4845 flags.go:64] FLAG: --client-ca-file="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955883 4845 flags.go:64] FLAG: --cloud-config="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955888 4845 flags.go:64] FLAG: --cloud-provider="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955892 4845 flags.go:64] FLAG: --cluster-dns="[]" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955898 4845 flags.go:64] FLAG: --cluster-domain="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955903 4845 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955907 4845 flags.go:64] FLAG: 
--config-dir="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955912 4845 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955916 4845 flags.go:64] FLAG: --container-log-max-files="5" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955929 4845 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955934 4845 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955939 4845 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955943 4845 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955948 4845 flags.go:64] FLAG: --contention-profiling="false" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955952 4845 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955957 4845 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955962 4845 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955966 4845 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955974 4845 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955978 4845 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955983 4845 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955988 4845 flags.go:64] FLAG: --enable-load-reader="false" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.955993 4845 flags.go:64] FLAG: --enable-server="true" Oct 06 06:45:15 crc 
kubenswrapper[4845]: I1006 06:45:15.955997 4845 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956004 4845 flags.go:64] FLAG: --event-burst="100" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956009 4845 flags.go:64] FLAG: --event-qps="50" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956014 4845 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956019 4845 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956024 4845 flags.go:64] FLAG: --eviction-hard="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956034 4845 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956038 4845 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956043 4845 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956048 4845 flags.go:64] FLAG: --eviction-soft="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956053 4845 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956057 4845 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956062 4845 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956067 4845 flags.go:64] FLAG: --experimental-mounter-path="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956072 4845 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956076 4845 flags.go:64] FLAG: --fail-swap-on="true" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956081 4845 flags.go:64] FLAG: --feature-gates="" Oct 06 06:45:15 crc 
kubenswrapper[4845]: I1006 06:45:15.956087 4845 flags.go:64] FLAG: --file-check-frequency="20s" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956092 4845 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956097 4845 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956102 4845 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956107 4845 flags.go:64] FLAG: --healthz-port="10248" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956111 4845 flags.go:64] FLAG: --help="false" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956116 4845 flags.go:64] FLAG: --hostname-override="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956121 4845 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956125 4845 flags.go:64] FLAG: --http-check-frequency="20s" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956130 4845 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956135 4845 flags.go:64] FLAG: --image-credential-provider-config="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956139 4845 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956143 4845 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956147 4845 flags.go:64] FLAG: --image-service-endpoint="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956151 4845 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956156 4845 flags.go:64] FLAG: --kube-api-burst="100" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956160 4845 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956165 4845 flags.go:64] FLAG: --kube-api-qps="50" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956171 4845 flags.go:64] FLAG: --kube-reserved="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956176 4845 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956181 4845 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956186 4845 flags.go:64] FLAG: --kubelet-cgroups="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956191 4845 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956195 4845 flags.go:64] FLAG: --lock-file="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956203 4845 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956208 4845 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956213 4845 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956220 4845 flags.go:64] FLAG: --log-json-split-stream="false" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956225 4845 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956229 4845 flags.go:64] FLAG: --log-text-split-stream="false" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956233 4845 flags.go:64] FLAG: --logging-format="text" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956238 4845 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956242 4845 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 
06:45:15.956247 4845 flags.go:64] FLAG: --manifest-url="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956251 4845 flags.go:64] FLAG: --manifest-url-header="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956257 4845 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956262 4845 flags.go:64] FLAG: --max-open-files="1000000" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956268 4845 flags.go:64] FLAG: --max-pods="110" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956272 4845 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956277 4845 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956281 4845 flags.go:64] FLAG: --memory-manager-policy="None" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956287 4845 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956292 4845 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956296 4845 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956301 4845 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956312 4845 flags.go:64] FLAG: --node-status-max-images="50" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956316 4845 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956320 4845 flags.go:64] FLAG: --oom-score-adj="-999" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956325 4845 flags.go:64] FLAG: --pod-cidr="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956329 4845 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956336 4845 flags.go:64] FLAG: --pod-manifest-path="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956341 4845 flags.go:64] FLAG: --pod-max-pids="-1" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956346 4845 flags.go:64] FLAG: --pods-per-core="0" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956350 4845 flags.go:64] FLAG: --port="10250" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956355 4845 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956359 4845 flags.go:64] FLAG: --provider-id="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956363 4845 flags.go:64] FLAG: --qos-reserved="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956367 4845 flags.go:64] FLAG: --read-only-port="10255" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956385 4845 flags.go:64] FLAG: --register-node="true" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956390 4845 flags.go:64] FLAG: --register-schedulable="true" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956395 4845 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956404 4845 flags.go:64] FLAG: --registry-burst="10" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956409 4845 flags.go:64] FLAG: --registry-qps="5" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956412 4845 flags.go:64] FLAG: --reserved-cpus="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956417 4845 flags.go:64] FLAG: --reserved-memory="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956423 4845 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 
06:45:15.956427 4845 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956431 4845 flags.go:64] FLAG: --rotate-certificates="false" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956435 4845 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956439 4845 flags.go:64] FLAG: --runonce="false" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956443 4845 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956447 4845 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956452 4845 flags.go:64] FLAG: --seccomp-default="false" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956456 4845 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956461 4845 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956465 4845 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956469 4845 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956474 4845 flags.go:64] FLAG: --storage-driver-password="root" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956478 4845 flags.go:64] FLAG: --storage-driver-secure="false" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956483 4845 flags.go:64] FLAG: --storage-driver-table="stats" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956487 4845 flags.go:64] FLAG: --storage-driver-user="root" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956492 4845 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956496 4845 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 06 
06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956501 4845 flags.go:64] FLAG: --system-cgroups="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956505 4845 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956513 4845 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956517 4845 flags.go:64] FLAG: --tls-cert-file="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956521 4845 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956526 4845 flags.go:64] FLAG: --tls-min-version="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956530 4845 flags.go:64] FLAG: --tls-private-key-file="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956534 4845 flags.go:64] FLAG: --topology-manager-policy="none" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956538 4845 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956542 4845 flags.go:64] FLAG: --topology-manager-scope="container" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956546 4845 flags.go:64] FLAG: --v="2" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956552 4845 flags.go:64] FLAG: --version="false" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956558 4845 flags.go:64] FLAG: --vmodule="" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956564 4845 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.956569 4845 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956708 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956714 4845 feature_gate.go:353] Setting GA feature 
gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956720 4845 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956725 4845 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956729 4845 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956733 4845 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956738 4845 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956742 4845 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956745 4845 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956749 4845 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956753 4845 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956757 4845 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956762 4845 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956766 4845 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956770 4845 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 
06:45:15.956774 4845 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956778 4845 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956783 4845 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956787 4845 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956792 4845 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956795 4845 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956799 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956804 4845 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956809 4845 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956814 4845 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956818 4845 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956822 4845 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956826 4845 feature_gate.go:330] unrecognized feature gate: Example Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956830 4845 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956834 4845 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956838 4845 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956842 4845 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956845 4845 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956849 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956854 4845 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956858 4845 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956862 4845 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 06 06:45:15 crc 
kubenswrapper[4845]: W1006 06:45:15.956865 4845 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956869 4845 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956873 4845 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956876 4845 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956880 4845 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956884 4845 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956887 4845 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956891 4845 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956894 4845 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956898 4845 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956901 4845 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956905 4845 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956909 4845 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956913 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956916 4845 feature_gate.go:330] unrecognized feature gate: 
MachineConfigNodes Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956920 4845 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956924 4845 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956928 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956932 4845 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956935 4845 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956939 4845 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956943 4845 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956946 4845 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956950 4845 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956954 4845 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956957 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956962 4845 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956966 4845 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956970 4845 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956973 4845 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956977 4845 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956981 4845 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956985 4845 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.956997 4845 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.957009 4845 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.967543 4845 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.967599 4845 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967774 4845 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967799 4845 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967813 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967825 4845 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967835 4845 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967844 4845 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967852 4845 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967859 4845 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967867 4845 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967875 4845 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967883 4845 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967893 4845 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967901 4845 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967909 4845 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967917 4845 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967925 4845 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967933 4845 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967941 4845 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967948 4845 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967956 4845 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967964 4845 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967972 4845 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967980 4845 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.967990 4845 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968000 4845 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968011 4845 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968021 4845 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968031 4845 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968041 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968051 4845 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968061 4845 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968072 4845 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968107 4845 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968117 4845 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968135 4845 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968148 4845 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968159 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968172 4845 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
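Every entry above carries a klog-style header after the journald prefix, e.g. `W1006 06:45:15.968172 4845 feature_gate.go:353] ...`: a severity letter (I/W/E/F), the month and day fused together, the wall-clock time, the PID, and the emitting `file:line`. A small parser for that header format is handy when filtering a log like this one; this is a sketch, and the returned field names are my own:

```python
import re

# klog header: <sev><MMDD> <HH:MM:SS.micros> <pid> <file>:<line>] <message>
KLOG_RE = re.compile(
    r"^(?P<sev>[IWEF])(?P<month>\d{2})(?P<day>\d{2})\s+"
    r"(?P<time>\d{2}:\d{2}:\d{2}\.\d+)\s+(?P<pid>\d+)\s+"
    r"(?P<file>[^ :]+):(?P<line>\d+)\]\s?(?P<msg>.*)$"
)

def parse_klog_line(line):
    """Split a klog-formatted line into its header fields and message."""
    m = KLOG_RE.match(line)
    if m is None:
        return None
    d = m.groupdict()
    return {
        "severity": d["sev"],      # I, W, E, or F
        "month": int(d["month"]),
        "day": int(d["day"]),
        "time": d["time"],
        "pid": int(d["pid"]),
        "file": d["file"],
        "line": int(d["line"]),
        "message": d["msg"],
    }
```

For journal lines like the ones here, strip the `Oct 06 06:45:15 crc kubenswrapper[4845]: ` prefix first, then feed the remainder to `parse_klog_line`.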
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968183 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968193 4845 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968202 4845 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968211 4845 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968219 4845 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968227 4845 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968235 4845 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968243 4845 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968250 4845 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968258 4845 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968266 4845 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968274 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968282 4845 feature_gate.go:330] unrecognized feature gate: Example
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968290 4845 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968298 4845 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968305 4845 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968314 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968325 4845 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968335 4845 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968343 4845 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968351 4845 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968360 4845 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968367 4845 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968406 4845 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968414 4845 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968423 4845 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968431 4845 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968440 4845 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968448 4845 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968455 4845 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968464 4845 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968472 4845 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968481 4845 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.968496 4845 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968782 4845 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968800 4845 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968809 4845 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968818 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968829 4845 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968839 4845 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968848 4845 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968856 4845 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968866 4845 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968900 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968909 4845 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968919 4845 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968929 4845 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968938 4845 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968946 4845 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968954 4845 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968963 4845 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968971 4845 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968979 4845 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968986 4845 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.968994 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969002 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969009 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969017 4845 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969025 4845 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969032 4845 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969040 4845 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969048 4845 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969056 4845 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969063 4845 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969073 4845 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969083 4845 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969091 4845 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969099 4845 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969122 4845 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969131 4845 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969139 4845 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969147 4845 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969154 4845 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969162 4845 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969170 4845 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969177 4845 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969185 4845 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969192 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969201 4845 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969209 4845 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969216 4845 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969224 4845 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969232 4845 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969240 4845 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969248 4845 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969258 4845 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969268 4845 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969278 4845 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969287 4845 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969295 4845 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969306 4845 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969315 4845 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969324 4845 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969332 4845 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969339 4845 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969348 4845 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969355 4845 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969364 4845 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969398 4845 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969407 4845 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969415 4845 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969423 4845 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969431 4845 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969438 4845 feature_gate.go:330] unrecognized feature gate: Example
Oct 06 06:45:15 crc kubenswrapper[4845]: W1006 06:45:15.969448 4845 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.969462 4845 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.969822 4845 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.977797 4845 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.977962 4845 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.979892 4845 server.go:997] "Starting client certificate rotation"
Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.979946 4845 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.980170 4845 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-22 14:31:23.242680683 +0000 UTC
Oct 06 06:45:15 crc kubenswrapper[4845]: I1006 06:45:15.980312 4845 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1855h46m7.26237166s for next certificate rotation
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.010337 4845 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.013981 4845 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.047836 4845 log.go:25] "Validated CRI v1 runtime API"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.083588 4845 log.go:25] "Validated CRI v1 image API"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.087049 4845 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.093099 4845 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-06-06-41-13-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.093136 4845 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.113412 4845 manager.go:217] Machine: {Timestamp:2025-10-06 06:45:16.110462484 +0000 UTC m=+0.625203512 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985 BootID:0eab3b1f-e032-4e17-acfe-a00e1d48a232 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:90:b8:9f Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:90:b8:9f Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:e7:aa:58 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:fc:30:8d Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:63:0b:7f Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:fd:ca:f6 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:e6:1c:c8:8b:d3:71 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:0e:da:fa:12:04:cf Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.113728 4845 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.113975 4845 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.116197 4845 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.116438 4845 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.116488 4845 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.116773 4845 topology_manager.go:138] "Creating topology manager with none policy"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.116786 4845 container_manager_linux.go:303] "Creating device plugin manager"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.117351 4845 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.117407 4845 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.117669 4845 state_mem.go:36] "Initialized new in-memory state store"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.118243 4845 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.121510 4845 kubelet.go:418] "Attempting to sync node with API server"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.121540 4845 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.121558 4845 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.121573 4845 kubelet.go:324] "Adding apiserver pod source"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.121597 4845 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.128033 4845 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.129527 4845 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 06 06:45:16 crc kubenswrapper[4845]: W1006 06:45:16.129750 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 06 06:45:16 crc kubenswrapper[4845]: E1006 06:45:16.129973 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Oct 06 06:45:16 crc kubenswrapper[4845]: W1006 06:45:16.130722 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 06 06:45:16 crc kubenswrapper[4845]: E1006 06:45:16.131158 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.132041 4845 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.133895 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.133940 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 
06:45:16.133956 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.133970 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.133992 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.134008 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.134024 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.134052 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.134072 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.134091 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.134135 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.134152 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.136675 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.137766 4845 server.go:1280] "Started kubelet" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.139046 4845 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.139293 4845 
server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.139307 4845 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 06 06:45:16 crc systemd[1]: Started Kubernetes Kubelet. Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.143064 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.143134 4845 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.140174 4845 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.143650 4845 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.143689 4845 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 06 06:45:16 crc kubenswrapper[4845]: E1006 06:45:16.143960 4845 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.144059 4845 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.144162 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 12:38:25.393820794 +0000 UTC Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.144230 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1397h53m9.249596954s for next certificate rotation Oct 06 06:45:16 crc kubenswrapper[4845]: W1006 06:45:16.145323 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 06 06:45:16 crc kubenswrapper[4845]: E1006 06:45:16.145460 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.145836 4845 factory.go:55] Registering systemd factory Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.145863 4845 factory.go:221] Registration of the systemd container factory successfully Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.146527 4845 factory.go:153] Registering CRI-O factory Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.146564 4845 factory.go:221] Registration of the crio container factory successfully Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.146643 4845 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.146673 4845 factory.go:103] Registering Raw factory Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.146690 4845 manager.go:1196] Started watching for new ooms in manager Oct 06 06:45:16 crc kubenswrapper[4845]: E1006 06:45:16.145337 4845 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.230:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186bd3f216e0f1b0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-06 06:45:16.137697712 +0000 UTC m=+0.652438790,LastTimestamp:2025-10-06 06:45:16.137697712 +0000 UTC m=+0.652438790,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 06 06:45:16 crc kubenswrapper[4845]: E1006 06:45:16.153889 4845 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="200ms" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.154244 4845 server.go:460] "Adding debug handlers to kubelet server" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.154407 4845 manager.go:319] Starting recovery of all containers Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165116 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165221 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165248 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" 
seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165273 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165299 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165326 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165353 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165411 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165445 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165469 4845 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165493 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165520 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165550 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165585 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165610 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165635 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165667 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165693 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165784 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165873 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165905 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165933 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165962 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.165990 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166050 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166140 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166178 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166209 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166240 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166266 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166291 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166318 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166348 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166407 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166437 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166464 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166490 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166525 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166553 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166583 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" 
seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166611 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166642 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166670 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166698 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166726 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166758 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166791 4845 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166816 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166847 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166880 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.166942 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.167023 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.167062 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.167091 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.167119 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.167148 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.167181 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.167211 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.167242 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.167269 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.167294 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.167323 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.167351 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.167412 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.169524 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.169589 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.169645 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.169675 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.169703 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.169744 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.169772 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" 
seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.169813 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.169841 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.169868 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.169908 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.174702 4845 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175119 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175147 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175164 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175193 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175209 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175233 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175250 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175266 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175290 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175305 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175319 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175337 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175367 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175400 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175414 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175427 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175444 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175458 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175475 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175486 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175501 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175520 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175532 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175551 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175562 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175578 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175597 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175610 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175654 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175676 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175710 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175734 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175753 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175779 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175800 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175815 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175836 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175858 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175871 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175901 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175917 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175930 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175948 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" 
seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175961 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175980 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.175992 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176007 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176028 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176046 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 
06:45:16.176063 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176088 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176101 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176119 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176134 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176151 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176163 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176177 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176193 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176205 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176216 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176235 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176254 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176292 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176308 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176520 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176534 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176551 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176606 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176619 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176637 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176650 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176662 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176680 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176693 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176707 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176720 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176734 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176751 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176765 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176801 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176815 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176830 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176843 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176856 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176870 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176891 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" 
seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176902 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176919 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176932 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176966 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176979 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.176993 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177008 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177021 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177035 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177053 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177068 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177087 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177105 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177124 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177143 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177163 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177185 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177201 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177218 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177239 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177256 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177279 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177295 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177313 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177332 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177348 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177582 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177616 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177657 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177690 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177794 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177818 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177843 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177873 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177948 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.177972 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.178000 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.178022 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.178051 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.178120 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.178151 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.178179 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.178202 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.178226 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.178247 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.178324 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.178348 4845 reconstruct.go:97] "Volume reconstruction finished"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.178396 4845 reconciler.go:26] "Reconciler: start to sync state"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.185135 4845 manager.go:324] Recovery completed
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.198275 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.200467 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.200518 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.200532 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.201498 4845 cpu_manager.go:225] "Starting CPU manager" policy="none"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.201520 4845 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.201561 4845 state_mem.go:36] "Initialized new in-memory state store"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.218010 4845 policy_none.go:49] "None policy: Start"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.219659 4845 memory_manager.go:170] "Starting memorymanager" policy="None"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.219704 4845 state_mem.go:35] "Initializing new in-memory state store"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.221536 4845 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.224302 4845 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.224838 4845 status_manager.go:217] "Starting to sync pod status with apiserver"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.224885 4845 kubelet.go:2335] "Starting kubelet main sync loop"
Oct 06 06:45:16 crc kubenswrapper[4845]: W1006 06:45:16.225825 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused
Oct 06 06:45:16 crc kubenswrapper[4845]: E1006 06:45:16.225993 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError"
Oct 06 06:45:16 crc kubenswrapper[4845]: E1006 06:45:16.226067 4845 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Oct 06 06:45:16 crc kubenswrapper[4845]: E1006 06:45:16.244243 4845 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.298884 4845 manager.go:334] "Starting Device Plugin manager"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.298944 4845 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.298963 4845 server.go:79] "Starting device plugin registration server"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.300531 4845 eviction_manager.go:189] "Eviction manager: starting control loop"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.300578 4845 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.301001 4845 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.301217 4845 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.301244 4845 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Oct 06 06:45:16 crc kubenswrapper[4845]: E1006 06:45:16.312857 4845 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.326291 4845 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.326434 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.327483 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.327547 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.327566 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.327809 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.328032 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.328071 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.328842 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.328886 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.328902 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.329010 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.329060 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.329076 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.329107 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.329432 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.329539 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.330225 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.330266 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.330280 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.330467 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.330600 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.330654 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.331026 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.331058 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.331082 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.331416 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.331448 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.331462 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.331641 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.331679 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.331712 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.332434 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.332491 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.332513 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.332567 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.332597 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.332598 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.332612 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.332624 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.332641 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.332871 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.332926 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.334227 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.334257 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.334269 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:45:16 crc kubenswrapper[4845]: E1006 06:45:16.354931 4845 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="400ms"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.382325 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.382391 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.382419 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.382444 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.382468 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.382500 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.382625 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.382679 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.382801 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.383258 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.383309 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.383351 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.383456 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.383495 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.383526 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.401077 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.402855 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.402924 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.402965 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.403002 4845 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: E1006 06:45:16.403722 4845 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection refused" node="crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.485296 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.485418 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.485447 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.485521 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.485546 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.485572 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.485594 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.485615 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.485644 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.485684 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.485704 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.485724 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.485744 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.485764 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.485786 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.486176 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.486265 4845
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.486302 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.486318 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.486339 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.486357 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.486396 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.486352 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.486461 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.486475 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.486452 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.486429 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 
06:45:16.486428 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.486659 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.486742 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.604486 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.606122 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.606197 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.606232 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.606284 4845 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 06:45:16 crc kubenswrapper[4845]: E1006 06:45:16.607335 4845 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection refused" node="crc" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.671598 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.691836 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.711484 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 06:45:16 crc kubenswrapper[4845]: W1006 06:45:16.732117 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-5dbce1670b23bb9c5c2b168ba48796b5fb23f2d993e56d6fa423bfdfa6a9091a WatchSource:0}: Error finding container 5dbce1670b23bb9c5c2b168ba48796b5fb23f2d993e56d6fa423bfdfa6a9091a: Status 404 returned error can't find the container with id 5dbce1670b23bb9c5c2b168ba48796b5fb23f2d993e56d6fa423bfdfa6a9091a Oct 06 06:45:16 crc kubenswrapper[4845]: W1006 06:45:16.733915 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-8ed95a3513fc5f46eb6102560550e3c508f25384ec3201e3ffbba7bf67c2ae59 WatchSource:0}: Error finding container 8ed95a3513fc5f46eb6102560550e3c508f25384ec3201e3ffbba7bf67c2ae59: Status 404 returned error can't find the container with id 8ed95a3513fc5f46eb6102560550e3c508f25384ec3201e3ffbba7bf67c2ae59 Oct 06 06:45:16 crc kubenswrapper[4845]: W1006 06:45:16.741256 4845 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-8329869845a40a401d2cee599aba4063cd7c0bdbfa9a4eae17f28c0f1e6a79d2 WatchSource:0}: Error finding container 8329869845a40a401d2cee599aba4063cd7c0bdbfa9a4eae17f28c0f1e6a79d2: Status 404 returned error can't find the container with id 8329869845a40a401d2cee599aba4063cd7c0bdbfa9a4eae17f28c0f1e6a79d2 Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.744878 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 06:45:16 crc kubenswrapper[4845]: E1006 06:45:16.755980 4845 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="800ms" Oct 06 06:45:16 crc kubenswrapper[4845]: I1006 06:45:16.760187 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 06:45:16 crc kubenswrapper[4845]: W1006 06:45:16.770325 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-72351bfea236fc9770751f525c8374997e6b0c088e75b90e5f9fac6dc30637c3 WatchSource:0}: Error finding container 72351bfea236fc9770751f525c8374997e6b0c088e75b90e5f9fac6dc30637c3: Status 404 returned error can't find the container with id 72351bfea236fc9770751f525c8374997e6b0c088e75b90e5f9fac6dc30637c3 Oct 06 06:45:16 crc kubenswrapper[4845]: W1006 06:45:16.797753 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-c437ffab19eac0a1e0b17266244cab711ccf80a7c1e9978babd5744cc502f9bc WatchSource:0}: Error finding container c437ffab19eac0a1e0b17266244cab711ccf80a7c1e9978babd5744cc502f9bc: Status 404 returned error can't find the container with id c437ffab19eac0a1e0b17266244cab711ccf80a7c1e9978babd5744cc502f9bc Oct 06 06:45:17 crc kubenswrapper[4845]: I1006 06:45:17.007761 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:17 crc kubenswrapper[4845]: I1006 06:45:17.010161 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:17 crc kubenswrapper[4845]: I1006 06:45:17.010215 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:17 crc kubenswrapper[4845]: I1006 06:45:17.010231 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:17 crc kubenswrapper[4845]: I1006 06:45:17.010279 4845 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 06:45:17 crc 
kubenswrapper[4845]: E1006 06:45:17.011025 4845 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection refused" node="crc" Oct 06 06:45:17 crc kubenswrapper[4845]: W1006 06:45:17.105004 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 06 06:45:17 crc kubenswrapper[4845]: E1006 06:45:17.105151 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Oct 06 06:45:17 crc kubenswrapper[4845]: I1006 06:45:17.140184 4845 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 06 06:45:17 crc kubenswrapper[4845]: I1006 06:45:17.230741 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5dbce1670b23bb9c5c2b168ba48796b5fb23f2d993e56d6fa423bfdfa6a9091a"} Oct 06 06:45:17 crc kubenswrapper[4845]: I1006 06:45:17.232653 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8ed95a3513fc5f46eb6102560550e3c508f25384ec3201e3ffbba7bf67c2ae59"} Oct 06 06:45:17 crc kubenswrapper[4845]: I1006 06:45:17.233654 4845 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c437ffab19eac0a1e0b17266244cab711ccf80a7c1e9978babd5744cc502f9bc"} Oct 06 06:45:17 crc kubenswrapper[4845]: I1006 06:45:17.234456 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"72351bfea236fc9770751f525c8374997e6b0c088e75b90e5f9fac6dc30637c3"} Oct 06 06:45:17 crc kubenswrapper[4845]: I1006 06:45:17.235211 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8329869845a40a401d2cee599aba4063cd7c0bdbfa9a4eae17f28c0f1e6a79d2"} Oct 06 06:45:17 crc kubenswrapper[4845]: W1006 06:45:17.413652 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 06 06:45:17 crc kubenswrapper[4845]: E1006 06:45:17.413751 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Oct 06 06:45:17 crc kubenswrapper[4845]: W1006 06:45:17.464193 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 06 06:45:17 crc kubenswrapper[4845]: E1006 
06:45:17.464290 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Oct 06 06:45:17 crc kubenswrapper[4845]: E1006 06:45:17.557784 4845 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="1.6s" Oct 06 06:45:17 crc kubenswrapper[4845]: W1006 06:45:17.673970 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 06 06:45:17 crc kubenswrapper[4845]: E1006 06:45:17.674068 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Oct 06 06:45:17 crc kubenswrapper[4845]: I1006 06:45:17.811216 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:17 crc kubenswrapper[4845]: I1006 06:45:17.812809 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:17 crc kubenswrapper[4845]: I1006 06:45:17.812883 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:17 crc kubenswrapper[4845]: I1006 06:45:17.812944 4845 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:17 crc kubenswrapper[4845]: I1006 06:45:17.812989 4845 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 06:45:17 crc kubenswrapper[4845]: E1006 06:45:17.813671 4845 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection refused" node="crc" Oct 06 06:45:18 crc kubenswrapper[4845]: E1006 06:45:18.127095 4845 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.230:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186bd3f216e0f1b0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-06 06:45:16.137697712 +0000 UTC m=+0.652438790,LastTimestamp:2025-10-06 06:45:16.137697712 +0000 UTC m=+0.652438790,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.141142 4845 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.240546 4845 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486" exitCode=0 Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.240630 4845 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486"} Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.240767 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.241962 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.241997 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.242008 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.242982 4845 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="17b8d88b2eabeee56862bc18814063c4fe514ab5c10af7d6875ebc84bf6f519c" exitCode=0 Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.243067 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"17b8d88b2eabeee56862bc18814063c4fe514ab5c10af7d6875ebc84bf6f519c"} Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.243130 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.243488 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.244156 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:18 crc 
kubenswrapper[4845]: I1006 06:45:18.244166 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.244176 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.244198 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.244357 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.244303 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.245497 4845 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="a84572ce22cbea68049578cfb55be5a69871367cd85827279d8f2bf54c46cac8" exitCode=0 Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.245565 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"a84572ce22cbea68049578cfb55be5a69871367cd85827279d8f2bf54c46cac8"} Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.245643 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.246740 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.246769 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.246783 4845 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.249038 4845 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872" exitCode=0 Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.249099 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872"} Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.249218 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.249967 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.249995 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.250008 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.251834 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e"} Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.251894 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878"} Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.251908 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27"} Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.251915 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.251918 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309"} Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.253025 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.253088 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:18 crc kubenswrapper[4845]: I1006 06:45:18.253108 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:18 crc kubenswrapper[4845]: W1006 06:45:18.967506 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 06 06:45:18 crc kubenswrapper[4845]: E1006 06:45:18.967621 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to 
watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.077422 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.140085 4845 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 06 06:45:19 crc kubenswrapper[4845]: E1006 06:45:19.159069 4845 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="3.2s" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.263783 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a"} Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.263870 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581"} Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.263892 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e"} Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.263905 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff"} Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.266251 4845 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0b5b6aef6abd593ca611937c57404bb6435d3d20fb9f91791717908684184044" exitCode=0 Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.266357 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0b5b6aef6abd593ca611937c57404bb6435d3d20fb9f91791717908684184044"} Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.266419 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.267280 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.267315 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.267326 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.269447 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0e5faf0556fb62654c41b0faf33af8e5c2600f152e9d1d56503350facfb81f8a"} Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.270781 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.272977 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.273055 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.273072 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.276040 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.276474 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.276725 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a6e270831b0c56a937ba5cd52f2367bce7b5e8a2837c035aa4e112439829cba2"} Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.276751 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"03b87b0a15c19a70cdb61338f704315d219bd382c5ced04b22851ef71a71464f"} Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.276762 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0ec9a2dc2d359f53f2165220af84f6ee28bfef0db39b86b12ebf600727e32e9d"} Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.277243 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.277265 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.277275 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.278113 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.279115 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.279638 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:19 crc kubenswrapper[4845]: W1006 06:45:19.369516 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Oct 06 06:45:19 crc kubenswrapper[4845]: E1006 06:45:19.369608 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.414056 4845 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.415473 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.415519 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.415531 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:19 crc kubenswrapper[4845]: I1006 06:45:19.415559 4845 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 06:45:19 crc kubenswrapper[4845]: E1006 06:45:19.416078 4845 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection refused" node="crc" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.268541 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.282318 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.289152 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"17f50cd86d2d127219a031cf1cda5eb9c0cddbb0cdf563cd7adbb9a43b3809bc"} Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.289301 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.290553 4845 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.290586 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.290601 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.292016 4845 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d75e6beec9b9b3f460928c12de268090c9d31eee03792da84924c065f22cadc4" exitCode=0 Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.292100 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.292122 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.292137 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d75e6beec9b9b3f460928c12de268090c9d31eee03792da84924c065f22cadc4"} Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.292130 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.292170 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.292226 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.292865 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:20 crc 
kubenswrapper[4845]: I1006 06:45:20.292892 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.292904 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.293576 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.293604 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.293618 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.293629 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.293651 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.293669 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.293681 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.293655 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:20 crc kubenswrapper[4845]: I1006 06:45:20.293710 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:21 crc kubenswrapper[4845]: I1006 06:45:21.300744 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"33d6fce2c41d0b6fd79d5f2f776fd69ad6e9412a8583dc91ca41d59043f4df6a"} Oct 06 06:45:21 crc kubenswrapper[4845]: I1006 06:45:21.300818 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 06:45:21 crc kubenswrapper[4845]: I1006 06:45:21.300832 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"acff58e4ef4b199dad0a25e080bb3653c91268fe6ace215dd5ee509ad7c0d3fe"} Oct 06 06:45:21 crc kubenswrapper[4845]: I1006 06:45:21.300859 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"af8fcaccec0d295c10c10001c0beb80b0fb1575657eb608315c6c2f1270e9647"} Oct 06 06:45:21 crc kubenswrapper[4845]: I1006 06:45:21.300873 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5a42a76eb1df39eb0bace80285d57edb4cff80c44bd18168b2aaabc4a0127764"} Oct 06 06:45:21 crc kubenswrapper[4845]: I1006 06:45:21.300886 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:21 crc kubenswrapper[4845]: I1006 06:45:21.300825 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 06:45:21 crc kubenswrapper[4845]: I1006 06:45:21.300941 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:21 crc kubenswrapper[4845]: I1006 06:45:21.302003 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:21 crc kubenswrapper[4845]: I1006 06:45:21.302029 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 06:45:21 crc kubenswrapper[4845]: I1006 06:45:21.302041 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:21 crc kubenswrapper[4845]: I1006 06:45:21.302201 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:21 crc kubenswrapper[4845]: I1006 06:45:21.302297 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:21 crc kubenswrapper[4845]: I1006 06:45:21.302319 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:22 crc kubenswrapper[4845]: I1006 06:45:22.078069 4845 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 06 06:45:22 crc kubenswrapper[4845]: I1006 06:45:22.078186 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 06:45:22 crc kubenswrapper[4845]: I1006 06:45:22.313264 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c3c9edda1b9378ab04ef63b8a1eef7adbc457863c0ecb89ac55490eeede6f56b"} Oct 06 06:45:22 crc kubenswrapper[4845]: I1006 06:45:22.313603 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 
06:45:22 crc kubenswrapper[4845]: I1006 06:45:22.315305 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:22 crc kubenswrapper[4845]: I1006 06:45:22.315416 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:22 crc kubenswrapper[4845]: I1006 06:45:22.315448 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:22 crc kubenswrapper[4845]: I1006 06:45:22.602498 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 06:45:22 crc kubenswrapper[4845]: I1006 06:45:22.602744 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 06:45:22 crc kubenswrapper[4845]: I1006 06:45:22.602826 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:22 crc kubenswrapper[4845]: I1006 06:45:22.604811 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:22 crc kubenswrapper[4845]: I1006 06:45:22.604891 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:22 crc kubenswrapper[4845]: I1006 06:45:22.604912 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:22 crc kubenswrapper[4845]: I1006 06:45:22.616315 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:22 crc kubenswrapper[4845]: I1006 06:45:22.618282 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:22 crc kubenswrapper[4845]: I1006 06:45:22.618345 4845 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:22 crc kubenswrapper[4845]: I1006 06:45:22.618365 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:22 crc kubenswrapper[4845]: I1006 06:45:22.618446 4845 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 06:45:23 crc kubenswrapper[4845]: I1006 06:45:23.316815 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:23 crc kubenswrapper[4845]: I1006 06:45:23.318505 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:23 crc kubenswrapper[4845]: I1006 06:45:23.318583 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:23 crc kubenswrapper[4845]: I1006 06:45:23.318604 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:23 crc kubenswrapper[4845]: I1006 06:45:23.474487 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 06 06:45:24 crc kubenswrapper[4845]: I1006 06:45:24.249072 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 06:45:24 crc kubenswrapper[4845]: I1006 06:45:24.249293 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:24 crc kubenswrapper[4845]: I1006 06:45:24.250989 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:24 crc kubenswrapper[4845]: I1006 06:45:24.251080 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:24 crc kubenswrapper[4845]: I1006 06:45:24.251098 4845 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:24 crc kubenswrapper[4845]: I1006 06:45:24.319647 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:24 crc kubenswrapper[4845]: I1006 06:45:24.321449 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:24 crc kubenswrapper[4845]: I1006 06:45:24.321502 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:24 crc kubenswrapper[4845]: I1006 06:45:24.321524 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:24 crc kubenswrapper[4845]: I1006 06:45:24.718721 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 06 06:45:24 crc kubenswrapper[4845]: I1006 06:45:24.946924 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 06:45:24 crc kubenswrapper[4845]: I1006 06:45:24.947248 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:24 crc kubenswrapper[4845]: I1006 06:45:24.949563 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:24 crc kubenswrapper[4845]: I1006 06:45:24.949640 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:24 crc kubenswrapper[4845]: I1006 06:45:24.949661 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:25 crc kubenswrapper[4845]: I1006 06:45:25.136515 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 06:45:25 crc 
kubenswrapper[4845]: I1006 06:45:25.323162 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:25 crc kubenswrapper[4845]: I1006 06:45:25.323162 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:25 crc kubenswrapper[4845]: I1006 06:45:25.324893 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:25 crc kubenswrapper[4845]: I1006 06:45:25.324955 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:25 crc kubenswrapper[4845]: I1006 06:45:25.324983 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:25 crc kubenswrapper[4845]: I1006 06:45:25.325231 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:25 crc kubenswrapper[4845]: I1006 06:45:25.325302 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:25 crc kubenswrapper[4845]: I1006 06:45:25.325328 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:26 crc kubenswrapper[4845]: I1006 06:45:26.196036 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 06:45:26 crc kubenswrapper[4845]: I1006 06:45:26.196425 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:26 crc kubenswrapper[4845]: I1006 06:45:26.198287 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:26 crc kubenswrapper[4845]: I1006 06:45:26.198338 4845 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:26 crc kubenswrapper[4845]: I1006 06:45:26.198357 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:26 crc kubenswrapper[4845]: E1006 06:45:26.312982 4845 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 06 06:45:26 crc kubenswrapper[4845]: I1006 06:45:26.640582 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 06:45:26 crc kubenswrapper[4845]: I1006 06:45:26.640934 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:26 crc kubenswrapper[4845]: I1006 06:45:26.643117 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:26 crc kubenswrapper[4845]: I1006 06:45:26.643183 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:26 crc kubenswrapper[4845]: I1006 06:45:26.643202 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:29 crc kubenswrapper[4845]: W1006 06:45:29.784338 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 06 06:45:29 crc kubenswrapper[4845]: I1006 06:45:29.785070 4845 trace.go:236] Trace[287201759]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 06:45:19.782) (total time: 10002ms): Oct 06 06:45:29 crc kubenswrapper[4845]: Trace[287201759]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": 
net/http: TLS handshake timeout 10002ms (06:45:29.784) Oct 06 06:45:29 crc kubenswrapper[4845]: Trace[287201759]: [10.002810179s] [10.002810179s] END Oct 06 06:45:29 crc kubenswrapper[4845]: E1006 06:45:29.785104 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 06 06:45:29 crc kubenswrapper[4845]: I1006 06:45:29.818941 4845 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:51444->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 06 06:45:29 crc kubenswrapper[4845]: I1006 06:45:29.819059 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:51444->192.168.126.11:17697: read: connection reset by peer" Oct 06 06:45:29 crc kubenswrapper[4845]: W1006 06:45:29.840182 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 06 06:45:29 crc kubenswrapper[4845]: I1006 06:45:29.840295 4845 trace.go:236] Trace[330864664]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 06:45:19.839) (total time: 10001ms): Oct 06 06:45:29 crc kubenswrapper[4845]: Trace[330864664]: ---"Objects listed" error:Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (06:45:29.840) Oct 06 06:45:29 crc kubenswrapper[4845]: Trace[330864664]: [10.00107152s] [10.00107152s] END Oct 06 06:45:29 crc kubenswrapper[4845]: E1006 06:45:29.840327 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 06 06:45:30 crc kubenswrapper[4845]: I1006 06:45:30.140830 4845 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 06 06:45:30 crc kubenswrapper[4845]: I1006 06:45:30.342940 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 06 06:45:30 crc kubenswrapper[4845]: I1006 06:45:30.346119 4845 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="17f50cd86d2d127219a031cf1cda5eb9c0cddbb0cdf563cd7adbb9a43b3809bc" exitCode=255 Oct 06 06:45:30 crc kubenswrapper[4845]: I1006 06:45:30.346183 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"17f50cd86d2d127219a031cf1cda5eb9c0cddbb0cdf563cd7adbb9a43b3809bc"} Oct 06 06:45:30 crc kubenswrapper[4845]: I1006 06:45:30.346498 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:30 crc kubenswrapper[4845]: I1006 06:45:30.347842 4845 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:45:30 crc kubenswrapper[4845]: I1006 06:45:30.347880 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:45:30 crc kubenswrapper[4845]: I1006 06:45:30.347895 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:45:30 crc kubenswrapper[4845]: I1006 06:45:30.348560 4845 scope.go:117] "RemoveContainer" containerID="17f50cd86d2d127219a031cf1cda5eb9c0cddbb0cdf563cd7adbb9a43b3809bc"
Oct 06 06:45:31 crc kubenswrapper[4845]: I1006 06:45:31.351069 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Oct 06 06:45:31 crc kubenswrapper[4845]: I1006 06:45:31.352791 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb"}
Oct 06 06:45:31 crc kubenswrapper[4845]: I1006 06:45:31.353016 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 06:45:31 crc kubenswrapper[4845]: I1006 06:45:31.354051 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:45:31 crc kubenswrapper[4845]: I1006 06:45:31.354100 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:45:31 crc kubenswrapper[4845]: I1006 06:45:31.354121 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:45:31 crc kubenswrapper[4845]: I1006 06:45:31.411053 4845 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403}
Oct 06 06:45:31 crc kubenswrapper[4845]: I1006 06:45:31.411128 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 06 06:45:31 crc kubenswrapper[4845]: I1006 06:45:31.418440 4845 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403}
Oct 06 06:45:31 crc kubenswrapper[4845]: I1006 06:45:31.418547 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 06 06:45:32 crc kubenswrapper[4845]: I1006 06:45:32.077728 4845 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 06 06:45:32 crc kubenswrapper[4845]: I1006 06:45:32.077817 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 06 06:45:32 crc kubenswrapper[4845]: I1006 06:45:32.607493 4845 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]log ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]etcd ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/openshift.io-startkubeinformers ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/start-apiserver-admission-initializer ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/openshift.io-api-request-count-filter ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/generic-apiserver-start-informers ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/priority-and-fairness-config-consumer ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/priority-and-fairness-filter ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/storage-object-count-tracker-hook ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/start-apiextensions-informers ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/start-apiextensions-controllers ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/crd-informer-synced ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/start-system-namespaces-controller ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/start-cluster-authentication-info-controller ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/start-legacy-token-tracking-controller ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/start-service-ip-repair-controllers ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/priority-and-fairness-config-producer ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/bootstrap-controller ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/start-kube-aggregator-informers ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/apiservice-status-local-available-controller ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/apiservice-status-remote-available-controller ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/apiservice-registration-controller ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/apiservice-wait-for-first-sync ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/apiservice-discovery-controller ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/kube-apiserver-autoregistration ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]autoregister-completion ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/apiservice-openapi-controller ok
Oct 06 06:45:32 crc kubenswrapper[4845]: [+]poststarthook/apiservice-openapiv3-controller ok
Oct 06 06:45:32 crc kubenswrapper[4845]: livez check failed
Oct 06 06:45:32 crc kubenswrapper[4845]: I1006 06:45:32.607584 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 06 06:45:34 crc kubenswrapper[4845]: I1006 06:45:34.253915 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 06 06:45:34 crc kubenswrapper[4845]: I1006 06:45:34.254100 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 06:45:34 crc kubenswrapper[4845]: I1006 06:45:34.255369 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:45:34 crc kubenswrapper[4845]: I1006 06:45:34.255465 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:45:34 crc kubenswrapper[4845]: I1006 06:45:34.255479 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:45:34 crc kubenswrapper[4845]: I1006 06:45:34.742288 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Oct 06 06:45:34 crc kubenswrapper[4845]: I1006 06:45:34.742544 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 06:45:34 crc kubenswrapper[4845]: I1006 06:45:34.743633 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:45:34 crc kubenswrapper[4845]: I1006 06:45:34.743673 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:45:34 crc kubenswrapper[4845]: I1006 06:45:34.743685 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:45:34 crc kubenswrapper[4845]: I1006 06:45:34.754418 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Oct 06 06:45:34 crc kubenswrapper[4845]: I1006 06:45:34.858299 4845 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Oct 06 06:45:34 crc kubenswrapper[4845]: I1006 06:45:34.947453 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 06:45:34 crc kubenswrapper[4845]: I1006 06:45:34.947617 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 06:45:34 crc kubenswrapper[4845]: I1006 06:45:34.949154 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:45:34 crc kubenswrapper[4845]: I1006 06:45:34.949202 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:45:34 crc kubenswrapper[4845]: I1006 06:45:34.949216 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:45:35 crc kubenswrapper[4845]: I1006 06:45:35.364823 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 06:45:35 crc kubenswrapper[4845]: I1006 06:45:35.365586 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:45:35 crc kubenswrapper[4845]: I1006 06:45:35.365630 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:45:35 crc kubenswrapper[4845]: I1006 06:45:35.365643 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:45:35 crc kubenswrapper[4845]: I1006 06:45:35.940133 4845 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.135336 4845 apiserver.go:52] "Watching apiserver"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.141393 4845 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.141793 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.142515 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.142850 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.142920 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.142967 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.142995 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.143307 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:45:36 crc kubenswrapper[4845]: E1006 06:45:36.143494 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 06:45:36 crc kubenswrapper[4845]: E1006 06:45:36.143635 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 06:45:36 crc kubenswrapper[4845]: E1006 06:45:36.143648 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.149609 4845 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.150904 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.151141 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.151258 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.151197 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.151360 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.151479 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.151198 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.151669 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.151675 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Oct 06 06:45:36 crc kubenswrapper[4845]:
I1006 06:45:36.221777 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.241800 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.252824 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.262636 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.271566 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.281260 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.291855 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.301860 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.313645 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.339680 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.357850 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.379797 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.391022 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 06:45:36 crc kubenswrapper[4845]: E1006 06:45:36.406586 4845 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.409686 4845 trace.go:236] Trace[1970809614]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 06:45:23.212) (total time: 13197ms): Oct 06 06:45:36 crc kubenswrapper[4845]: Trace[1970809614]: ---"Objects listed" error: 13197ms (06:45:36.409) Oct 06 06:45:36 crc kubenswrapper[4845]: Trace[1970809614]: [13.197117031s] [13.197117031s] END Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.409723 4845 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.410914 4845 trace.go:236] Trace[539244960]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 06:45:25.197) (total time: 11213ms): Oct 06 06:45:36 crc 
kubenswrapper[4845]: Trace[539244960]: ---"Objects listed" error: 11212ms (06:45:36.410) Oct 06 06:45:36 crc kubenswrapper[4845]: Trace[539244960]: [11.21309894s] [11.21309894s] END Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.411090 4845 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 06 06:45:36 crc kubenswrapper[4845]: E1006 06:45:36.411976 4845 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.412062 4845 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513495 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513536 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513558 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513581 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513598 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513650 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513668 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513682 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513696 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 06 06:45:36 crc 
kubenswrapper[4845]: I1006 06:45:36.513711 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513726 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513742 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513757 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513795 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513811 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513823 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513845 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513865 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513879 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513898 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513914 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513933 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513949 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513946 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.513969 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514012 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514019 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514029 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514045 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514062 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514081 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514099 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514116 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514131 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514148 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514165 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514181 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514211 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514229 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514249 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514266 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514283 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514305 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" 
(UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514319 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514335 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514350 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514364 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514395 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 06 06:45:36 crc 
kubenswrapper[4845]: I1006 06:45:36.514409 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514425 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514440 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514452 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514457 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514491 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514512 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514531 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514553 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514576 4845 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514600 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514617 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514622 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514634 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514650 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514668 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514685 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514701 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514718 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514734 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514751 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514750 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514768 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514865 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514889 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514909 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514912 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514926 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514935 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514956 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.514983 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515007 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 06 
06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515026 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515043 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515059 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515075 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515083 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515096 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515122 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515144 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515165 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515182 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515201 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515217 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515236 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515283 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515305 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515324 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515340 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515355 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515388 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515410 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515427 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515442 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515459 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515475 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515491 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515506 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515522 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 
06:45:36.515538 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515565 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515589 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515638 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515666 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515692 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" 
(UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515753 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515772 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515796 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515814 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515829 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515845 4845 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515862 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515879 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515899 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515917 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515933 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515952 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515968 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516027 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516047 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516066 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516084 4845 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516100 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516116 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516135 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516150 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516197 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516216 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516251 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516267 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516282 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516299 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516329 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516348 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516384 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516402 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516419 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516439 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 06 
06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516456 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516474 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516491 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516511 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516527 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516547 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" 
(UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516564 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516581 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516598 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516618 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516640 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516655 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516674 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516700 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516726 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516748 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516772 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" 
(UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516796 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516815 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516843 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516865 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516881 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 06:45:36 crc kubenswrapper[4845]: 
I1006 06:45:36.516898 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516915 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516932 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516950 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516966 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516986 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517004 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517021 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517041 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517065 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517090 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517108 4845 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517124 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517142 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517161 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517203 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517222 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517241 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517259 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517276 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517296 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517312 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 
06:45:36.517330 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517348 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517410 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517429 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517445 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517463 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517482 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517498 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517517 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517535 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517552 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517664 4845 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517693 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517713 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517731 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517775 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517822 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517850 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517888 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517909 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517926 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517943 4845 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517964 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517982 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518000 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518022 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518045 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518066 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518085 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518150 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518167 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518179 4845 reconciler_common.go:293] "Volume detached for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518190 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518200 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518211 4845 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518224 4845 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518265 4845 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518276 4845 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518287 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515096 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515229 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515238 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515349 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515360 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515532 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515670 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515733 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515839 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515888 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.515995 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516135 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516327 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.531869 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.532072 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.532526 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516541 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516673 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516730 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516798 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516877 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.516940 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517030 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517082 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517235 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517243 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517500 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517627 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517640 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517769 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.517785 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518000 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518160 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518222 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.532863 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518309 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518485 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518667 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518794 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.518947 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.519066 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.519165 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.522627 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.522740 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.529560 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.529633 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.529762 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.530040 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.530248 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.530312 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.531352 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.531628 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.533517 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.533801 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.534278 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.534490 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.534779 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.534810 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.535081 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.535267 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.545965 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.546284 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.546846 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.547398 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.547505 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.547658 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.547848 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.548025 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.552171 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.554136 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.554209 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.554653 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.555211 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.555421 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.555637 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.555970 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.556314 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.556531 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.556657 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.556893 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.557407 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.557460 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.558445 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.558620 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.553413 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.558928 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.558960 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.559364 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.559831 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: E1006 06:45:36.559925 4845 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 06 06:45:36 crc kubenswrapper[4845]: E1006 06:45:36.560070 4845 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.560406 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.560431 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.560476 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.560519 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.561085 4845 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.561157 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.561424 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.561449 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.561605 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.561723 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.561945 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.562158 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.562321 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.562420 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.561848 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.562552 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.562823 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.563086 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: E1006 06:45:36.556265 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:45:37.056237594 +0000 UTC m=+21.570978602 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.563302 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: E1006 06:45:36.563355 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 06:45:37.063326698 +0000 UTC m=+21.578067706 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 06 06:45:36 crc kubenswrapper[4845]: E1006 06:45:36.563389 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 06:45:37.063366719 +0000 UTC m=+21.578107727 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.563353 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.563512 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.563610 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.563737 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.563832 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.564010 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.564135 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.564227 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.564341 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.565090 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.565420 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.565828 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.566138 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.566283 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.566359 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.566346 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.566444 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.566647 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.566751 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.566876 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.567358 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.567485 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.567517 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.567731 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.568125 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.568180 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.559360 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.568494 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.568543 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.568581 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.568802 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.569840 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.568199 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.570034 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.570118 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.570142 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.570413 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.570503 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.570882 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.571298 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.571084 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.571540 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.571728 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.572003 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.572342 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.572362 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.572516 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.572678 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.573076 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.573654 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.574195 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.574722 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.575386 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 06:45:36 crc kubenswrapper[4845]: E1006 06:45:36.575482 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 06:45:36 crc kubenswrapper[4845]: E1006 06:45:36.575503 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 06:45:36 crc kubenswrapper[4845]: E1006 06:45:36.575517 4845 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:36 crc kubenswrapper[4845]: E1006 06:45:36.575598 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 06:45:37.075579522 +0000 UTC m=+21.590320530 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.579430 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.579793 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.580095 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.581180 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.583402 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.584025 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.584241 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.584510 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.584537 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.585110 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.585217 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.585279 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.585638 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.585775 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.585955 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.586095 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.586231 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.586816 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.587405 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.587504 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.587879 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.587910 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.588552 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.590007 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.590197 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.590508 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.591491 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.591712 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.598202 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.598279 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: E1006 06:45:36.598845 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 06:45:36 crc kubenswrapper[4845]: E1006 06:45:36.598866 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 06:45:36 crc kubenswrapper[4845]: E1006 06:45:36.598878 4845 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:36 crc kubenswrapper[4845]: E1006 06:45:36.598928 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 06:45:37.098910152 +0000 UTC m=+21.613651160 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.601124 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.618924 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.618975 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619041 4845 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619057 4845 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619071 4845 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619083 4845 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619095 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619106 4845 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619119 4845 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619131 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619141 4845 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619150 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619158 4845 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619167 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619176 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619184 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619195 4845 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619204 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 
06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619212 4845 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619221 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619229 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619239 4845 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619250 4845 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619262 4845 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619273 4845 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619285 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619298 4845 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619310 4845 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619322 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619334 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619417 4845 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619451 4845 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619447 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619465 4845 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619516 4845 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619518 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619529 4845 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619550 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619561 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619571 4845 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619581 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619589 4845 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619587 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619598 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619618 4845 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619631 4845 reconciler_common.go:293] "Volume detached for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619643 4845 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619652 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619661 4845 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619670 4845 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619679 4845 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619690 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619701 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" 
DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619710 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619719 4845 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619730 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619740 4845 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619749 4845 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619759 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619770 4845 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619780 4845 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619790 4845 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619799 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619808 4845 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619818 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619830 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619841 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619850 4845 reconciler_common.go:293] "Volume detached for 
volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619859 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619868 4845 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619877 4845 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619889 4845 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619901 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619910 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619922 4845 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619932 4845 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619940 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619949 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619958 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619967 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619975 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619984 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" 
DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.619993 4845 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620001 4845 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620010 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620019 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620028 4845 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620037 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620046 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620055 4845 
reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620064 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620074 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620084 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620095 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620103 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620112 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620123 4845 reconciler_common.go:293] "Volume detached for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620133 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620143 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620153 4845 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620162 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620172 4845 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620180 4845 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620189 4845 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620197 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620206 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620215 4845 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620223 4845 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620234 4845 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620243 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620252 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620262 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620274 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620283 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620293 4845 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620302 4845 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620311 4845 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620320 4845 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620329 4845 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620337 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620346 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620357 4845 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620366 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620389 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620398 4845 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620407 4845 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620416 4845 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620424 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620434 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620443 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620451 4845 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620460 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc 
kubenswrapper[4845]: I1006 06:45:36.620469 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620478 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620487 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620496 4845 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620505 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620514 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620524 4845 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620532 4845 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620544 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620553 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620564 4845 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620572 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620583 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620592 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620604 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on 
node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620613 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620635 4845 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620644 4845 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620654 4845 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620664 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620675 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620685 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620695 4845 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620704 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620714 4845 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620722 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620731 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620741 4845 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620751 4845 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620761 4845 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620772 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620781 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620791 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620800 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620809 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620818 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620826 4845 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on 
node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620836 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620847 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620859 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620869 4845 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620879 4845 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620890 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620898 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620907 4845 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620916 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620925 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620933 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620945 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620957 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620970 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620980 4845 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620989 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.620998 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.621007 4845 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.621016 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.722266 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.758714 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.767497 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 06:45:36 crc kubenswrapper[4845]: W1006 06:45:36.771776 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-631dfcfc7d22829fbc0081dd46160e6457f9419700d03afca1ea5c19e4933b15 WatchSource:0}: Error finding container 631dfcfc7d22829fbc0081dd46160e6457f9419700d03afca1ea5c19e4933b15: Status 404 returned error can't find the container with id 631dfcfc7d22829fbc0081dd46160e6457f9419700d03afca1ea5c19e4933b15 Oct 06 06:45:36 crc kubenswrapper[4845]: I1006 06:45:36.775765 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 06:45:36 crc kubenswrapper[4845]: W1006 06:45:36.797636 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-7448c4bba600285fd9f04e4183558b85b04034c18423bb1ac7c12a5fcbf17d92 WatchSource:0}: Error finding container 7448c4bba600285fd9f04e4183558b85b04034c18423bb1ac7c12a5fcbf17d92: Status 404 returned error can't find the container with id 7448c4bba600285fd9f04e4183558b85b04034c18423bb1ac7c12a5fcbf17d92 Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.126264 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.126356 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.126409 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.126431 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.126483 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:37 crc kubenswrapper[4845]: E1006 06:45:37.126597 4845 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 06:45:37 crc kubenswrapper[4845]: E1006 06:45:37.126604 4845 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:45:38.126555397 +0000 UTC m=+22.641296405 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:45:37 crc kubenswrapper[4845]: E1006 06:45:37.126671 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 06:45:38.126658209 +0000 UTC m=+22.641399437 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 06:45:37 crc kubenswrapper[4845]: E1006 06:45:37.126671 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 06:45:37 crc kubenswrapper[4845]: E1006 06:45:37.126719 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 06:45:37 crc kubenswrapper[4845]: E1006 06:45:37.126693 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 06:45:37 crc kubenswrapper[4845]: E1006 06:45:37.126760 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 06:45:37 crc kubenswrapper[4845]: E1006 06:45:37.126821 4845 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:37 crc kubenswrapper[4845]: E1006 06:45:37.126617 4845 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 06:45:37 crc kubenswrapper[4845]: E1006 06:45:37.126735 
4845 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:37 crc kubenswrapper[4845]: E1006 06:45:37.126956 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 06:45:38.126929185 +0000 UTC m=+22.641670373 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 06:45:37 crc kubenswrapper[4845]: E1006 06:45:37.126990 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 06:45:38.126973786 +0000 UTC m=+22.641714794 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:37 crc kubenswrapper[4845]: E1006 06:45:37.127010 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 06:45:38.127001057 +0000 UTC m=+22.641742285 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.370939 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7448c4bba600285fd9f04e4183558b85b04034c18423bb1ac7c12a5fcbf17d92"} Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.373583 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d"} Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.373661 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65"} Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.373679 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3ec9656a0e332c7f58b70bdce3758988554ade7ca214aa4bcb08025c187d458e"} Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.376254 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374"} Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.376287 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"631dfcfc7d22829fbc0081dd46160e6457f9419700d03afca1ea5c19e4933b15"} Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.378569 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.379166 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.381159 4845 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb" exitCode=255 Oct 06 06:45:37 crc 
kubenswrapper[4845]: I1006 06:45:37.381213 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb"} Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.381276 4845 scope.go:117] "RemoveContainer" containerID="17f50cd86d2d127219a031cf1cda5eb9c0cddbb0cdf563cd7adbb9a43b3809bc" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.389358 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.398106 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.407359 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.416242 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.426510 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.426559 4845 scope.go:117] "RemoveContainer" containerID="526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb" Oct 06 06:45:37 crc kubenswrapper[4845]: E1006 06:45:37.426879 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.431586 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.445845 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.457496 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.467791 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f50cd86d2d127219a031cf1cda5eb9c0cddbb0cdf563cd7adbb9a43b3809bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:29Z\\\",\\\"message\\\":\\\"W1006 06:45:19.388973 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 06:45:19.389328 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759733119 cert, and key in /tmp/serving-cert-2046128313/serving-signer.crt, /tmp/serving-cert-2046128313/serving-signer.key\\\\nI1006 06:45:19.582383 1 observer_polling.go:159] Starting file observer\\\\nW1006 06:45:19.587180 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1006 06:45:19.587535 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:19.588337 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2046128313/tls.crt::/tmp/serving-cert-2046128313/tls.key\\\\\\\"\\\\nF1006 06:45:29.814735 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.487122 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.499331 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:37Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.516054 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:37Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.530796 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:37Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.557185 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:37Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.606663 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.619387 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:37Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.636239 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d
773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:37Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.653009 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:37Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.670617 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f50cd86d2d127219a031cf1cda5eb9c0cddbb0cdf563cd7adbb9a43b3809bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:29Z\\\",\\\"message\\\":\\\"W1006 06:45:19.388973 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 06:45:19.389328 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759733119 cert, and key in /tmp/serving-cert-2046128313/serving-signer.crt, /tmp/serving-cert-2046128313/serving-signer.key\\\\nI1006 06:45:19.582383 1 observer_polling.go:159] Starting file observer\\\\nW1006 06:45:19.587180 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1006 06:45:19.587535 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:19.588337 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2046128313/tls.crt::/tmp/serving-cert-2046128313/tls.key\\\\\\\"\\\\nF1006 06:45:29.814735 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:37Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.683176 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:37Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.699000 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:37Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.709496 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:37Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.950174 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-587xc"] Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.950911 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.951725 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-tpgm6"] Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.952205 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-689qf"] Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.952389 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-zpn9l"] Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.952575 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-m79r8"] Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.952909 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-689qf" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.952924 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.952990 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zpn9l" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.953019 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.953770 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m79r8"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.955143 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.955155 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.955197 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.955241 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.955992 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.955996 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.956113 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.956184 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.956282 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.956512 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.956687 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.956910 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.957071 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.957102 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.957229 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.957445 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.957606 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.957635 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.957701 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.957739 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.961661 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Oct 06 06:45:37 crc kubenswrapper[4845]: I1006 06:45:37.996968 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:37Z is after 2025-08-24T17:21:41Z"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.009442 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.022202 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034067 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034407 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/58772108-964d-4d0c-90a4-70ad5fe1da2d-ovnkube-script-lib\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034469 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-host-var-lib-kubelet\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034494 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xhsv\" (UniqueName: \"kubernetes.io/projected/2080026c-9eee-4863-b62d-e9ce4d4525dd-kube-api-access-2xhsv\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034514 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/58772108-964d-4d0c-90a4-70ad5fe1da2d-ovn-node-metrics-cert\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034531 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rwfz\" (UniqueName: \"kubernetes.io/projected/453226ed-506e-48cb-89a8-a03ca92660e0-kube-api-access-8rwfz\") pod \"node-resolver-689qf\" (UID: \"453226ed-506e-48cb-89a8-a03ca92660e0\") " pod="openshift-dns/node-resolver-689qf"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034548 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-run-ovn-kubernetes\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034589 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-cni-netd\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034642 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/58772108-964d-4d0c-90a4-70ad5fe1da2d-ovnkube-config\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034661 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-cnibin\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034681 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-multus-conf-dir\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034703 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-cni-bin\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034728 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9zb5\" (UniqueName: \"kubernetes.io/projected/6936952c-09f0-48fd-8832-38c18202ae81-kube-api-access-s9zb5\") pod \"machine-config-daemon-tpgm6\" (UID: \"6936952c-09f0-48fd-8832-38c18202ae81\") " pod="openshift-machine-config-operator/machine-config-daemon-tpgm6"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034750 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-host-run-k8s-cni-cncf-io\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034770 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/331140be-ed04-4023-b244-31f5817b8803-cni-binary-copy\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034808 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-node-log\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034827 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-hostroot\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034846 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/331140be-ed04-4023-b244-31f5817b8803-system-cni-dir\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034872 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-kubelet\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034886 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-systemd-units\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034943 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-var-lib-openvswitch\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034963 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2080026c-9eee-4863-b62d-e9ce4d4525dd-multus-daemon-config\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.034983 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-host-run-multus-certs\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035007 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-system-cni-dir\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035025 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/453226ed-506e-48cb-89a8-a03ca92660e0-hosts-file\") pod \"node-resolver-689qf\" (UID: \"453226ed-506e-48cb-89a8-a03ca92660e0\") " pod="openshift-dns/node-resolver-689qf"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035042 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6936952c-09f0-48fd-8832-38c18202ae81-proxy-tls\") pod \"machine-config-daemon-tpgm6\" (UID: \"6936952c-09f0-48fd-8832-38c18202ae81\") " pod="openshift-machine-config-operator/machine-config-daemon-tpgm6"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035070 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6936952c-09f0-48fd-8832-38c18202ae81-rootfs\") pod \"machine-config-daemon-tpgm6\" (UID: \"6936952c-09f0-48fd-8832-38c18202ae81\") " pod="openshift-machine-config-operator/machine-config-daemon-tpgm6"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035101 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-etc-kubernetes\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035147 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-slash\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035163 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-multus-socket-dir-parent\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035179 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-host-run-netns\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035196 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/331140be-ed04-4023-b244-31f5817b8803-os-release\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035221 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/331140be-ed04-4023-b244-31f5817b8803-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035238 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-run-netns\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035268 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-etc-openvswitch\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035288 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-run-systemd\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035308 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-os-release\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035327 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-log-socket\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035343 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035363 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6936952c-09f0-48fd-8832-38c18202ae81-mcd-auth-proxy-config\") pod \"machine-config-daemon-tpgm6\" (UID: \"6936952c-09f0-48fd-8832-38c18202ae81\") " pod="openshift-machine-config-operator/machine-config-daemon-tpgm6"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035402 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-multus-cni-dir\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035428 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/331140be-ed04-4023-b244-31f5817b8803-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035448 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl2v8\" (UniqueName: \"kubernetes.io/projected/331140be-ed04-4023-b244-31f5817b8803-kube-api-access-rl2v8\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035469 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-run-openvswitch\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035488 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-run-ovn\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035508 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/58772108-964d-4d0c-90a4-70ad5fe1da2d-env-overrides\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035524 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zd8j\" (UniqueName: \"kubernetes.io/projected/58772108-964d-4d0c-90a4-70ad5fe1da2d-kube-api-access-9zd8j\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035540 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2080026c-9eee-4863-b62d-e9ce4d4525dd-cni-binary-copy\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035567 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-host-var-lib-cni-bin\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035601 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-host-var-lib-cni-multus\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.035626 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/331140be-ed04-4023-b244-31f5817b8803-cnibin\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.049400 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f50cd86d2d127219a031cf1cda5eb9c0cddbb0cdf563cd7adbb9a43b3809bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:29Z\\\",\\\"message\\\":\\\"W1006 06:45:19.388973 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 06:45:19.389328 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759733119 cert, and key in /tmp/serving-cert-2046128313/serving-signer.crt, /tmp/serving-cert-2046128313/serving-signer.key\\\\nI1006 06:45:19.582383 1 observer_polling.go:159] Starting file observer\\\\nW1006 06:45:19.587180 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1006 06:45:19.587535 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:19.588337 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2046128313/tls.crt::/tmp/serving-cert-2046128313/tls.key\\\\\\\"\\\\nF1006 06:45:29.814735 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.063583 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.086205 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.100197 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.117493 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.133584 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.135895 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136051 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-run-netns\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136115 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-run-netns\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: E1006 06:45:38.136145 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:45:40.136094544 +0000 UTC m=+24.650835612 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136233 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-etc-openvswitch\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136262 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/331140be-ed04-4023-b244-31f5817b8803-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136286 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-etc-openvswitch\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136299 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName:
\"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-run-systemd\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136319 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-os-release\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136346 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6936952c-09f0-48fd-8832-38c18202ae81-mcd-auth-proxy-config\") pod \"machine-config-daemon-tpgm6\" (UID: \"6936952c-09f0-48fd-8832-38c18202ae81\") " pod="openshift-machine-config-operator/machine-config-daemon-tpgm6"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136389 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-multus-cni-dir\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136429 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/331140be-ed04-4023-b244-31f5817b8803-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136464 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl2v8\" (UniqueName:
\"kubernetes.io/projected/331140be-ed04-4023-b244-31f5817b8803-kube-api-access-rl2v8\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136481 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-run-openvswitch\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136498 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-run-ovn\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136516 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-run-systemd\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136628 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-log-socket\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136651 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName:
\"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136673 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/58772108-964d-4d0c-90a4-70ad5fe1da2d-env-overrides\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136712 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zd8j\" (UniqueName: \"kubernetes.io/projected/58772108-964d-4d0c-90a4-70ad5fe1da2d-kube-api-access-9zd8j\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136717 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-os-release\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136731 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2080026c-9eee-4863-b62d-e9ce4d4525dd-cni-binary-copy\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136720 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-log-socket\") pod \"ovnkube-node-587xc\"
(UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136763 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-multus-cni-dir\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136751 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-host-var-lib-cni-bin\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136711 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-run-openvswitch\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136827 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-host-var-lib-cni-bin\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136835 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-run-ovn\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006
06:45:38.136900 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-host-var-lib-cni-multus\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136906 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.136822 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-host-var-lib-cni-multus\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137093 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/331140be-ed04-4023-b244-31f5817b8803-cnibin\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137137 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xhsv\" (UniqueName: \"kubernetes.io/projected/2080026c-9eee-4863-b62d-e9ce4d4525dd-kube-api-access-2xhsv\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137174 4845
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/58772108-964d-4d0c-90a4-70ad5fe1da2d-ovn-node-metrics-cert\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137182 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/331140be-ed04-4023-b244-31f5817b8803-cnibin\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137208 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/58772108-964d-4d0c-90a4-70ad5fe1da2d-ovnkube-script-lib\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137242 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-host-var-lib-kubelet\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137276 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-cni-netd\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137313 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for
volume \"kube-api-access-8rwfz\" (UniqueName: \"kubernetes.io/projected/453226ed-506e-48cb-89a8-a03ca92660e0-kube-api-access-8rwfz\") pod \"node-resolver-689qf\" (UID: \"453226ed-506e-48cb-89a8-a03ca92660e0\") " pod="openshift-dns/node-resolver-689qf"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137348 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-run-ovn-kubernetes\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137359 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-host-var-lib-kubelet\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137411 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-run-ovn-kubernetes\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137422 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-multus-conf-dir\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137459 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName:
\"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-multus-conf-dir\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137464 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/58772108-964d-4d0c-90a4-70ad5fe1da2d-ovnkube-config\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137504 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-cnibin\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137540 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-host-run-k8s-cni-cncf-io\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137578 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/331140be-ed04-4023-b244-31f5817b8803-cni-binary-copy\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137602 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-cnibin\") pod \"multus-zpn9l\" (UID:
\"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137614 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-host-run-k8s-cni-cncf-io\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137624 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137664 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-node-log\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137696 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-cni-bin\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137723 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-node-log\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06
06:45:38 crc kubenswrapper[4845]: E1006 06:45:38.137735 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137764 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-cni-bin\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: E1006 06:45:38.137791 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 06 06:45:38 crc kubenswrapper[4845]: E1006 06:45:38.137808 4845 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137824 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-cni-netd\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: E1006 06:45:38.137878 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 06:45:40.137867165 +0000 UTC m=+24.652608363 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137733 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9zb5\" (UniqueName: \"kubernetes.io/projected/6936952c-09f0-48fd-8832-38c18202ae81-kube-api-access-s9zb5\") pod \"machine-config-daemon-tpgm6\" (UID: \"6936952c-09f0-48fd-8832-38c18202ae81\") " pod="openshift-machine-config-operator/machine-config-daemon-tpgm6"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.137974 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-kubelet\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.138003 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-systemd-units\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.138026 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-hostroot\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006
06:45:38.138059 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-kubelet\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.138070 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-systemd-units\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.138122 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/331140be-ed04-4023-b244-31f5817b8803-system-cni-dir\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.138148 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-hostroot\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.138183 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.138202 4845
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/331140be-ed04-4023-b244-31f5817b8803-system-cni-dir\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.138253 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-var-lib-openvswitch\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.138302 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2080026c-9eee-4863-b62d-e9ce4d4525dd-multus-daemon-config\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l" Oct 06 06:45:38 crc kubenswrapper[4845]: E1006 06:45:38.138313 4845 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 06:45:38 crc kubenswrapper[4845]: E1006 06:45:38.138402 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 06:45:40.138362447 +0000 UTC m=+24.653103545 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.138321 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-var-lib-openvswitch\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.138423 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-host-run-multus-certs\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.138364 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-host-run-multus-certs\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.138535 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:38 crc kubenswrapper[4845]: E1006 06:45:38.138649 4845 
configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.138708 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/453226ed-506e-48cb-89a8-a03ca92660e0-hosts-file\") pod \"node-resolver-689qf\" (UID: \"453226ed-506e-48cb-89a8-a03ca92660e0\") " pod="openshift-dns/node-resolver-689qf" Oct 06 06:45:38 crc kubenswrapper[4845]: E1006 06:45:38.138726 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 06:45:40.138702275 +0000 UTC m=+24.653443463 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.138566 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/453226ed-506e-48cb-89a8-a03ca92660e0-hosts-file\") pod \"node-resolver-689qf\" (UID: \"453226ed-506e-48cb-89a8-a03ca92660e0\") " pod="openshift-dns/node-resolver-689qf" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.138790 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6936952c-09f0-48fd-8832-38c18202ae81-proxy-tls\") pod \"machine-config-daemon-tpgm6\" (UID: \"6936952c-09f0-48fd-8832-38c18202ae81\") " pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" Oct 06 
06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.138822 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-system-cni-dir\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.138940 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-system-cni-dir\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.138859 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.139013 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6936952c-09f0-48fd-8832-38c18202ae81-rootfs\") pod \"machine-config-daemon-tpgm6\" (UID: \"6936952c-09f0-48fd-8832-38c18202ae81\") " pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.139075 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-etc-kubernetes\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.139090 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6936952c-09f0-48fd-8832-38c18202ae81-rootfs\") pod \"machine-config-daemon-tpgm6\" (UID: \"6936952c-09f0-48fd-8832-38c18202ae81\") " pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" Oct 06 06:45:38 crc kubenswrapper[4845]: E1006 06:45:38.139021 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.139106 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-etc-kubernetes\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l" Oct 06 06:45:38 crc kubenswrapper[4845]: E1006 06:45:38.139140 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 06:45:38 crc kubenswrapper[4845]: E1006 06:45:38.139160 4845 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:38 crc kubenswrapper[4845]: E1006 06:45:38.139216 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 06:45:40.139202836 +0000 UTC m=+24.653944054 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.139259 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-multus-socket-dir-parent\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.139290 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-host-run-netns\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.139304 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-multus-socket-dir-parent\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.139318 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/331140be-ed04-4023-b244-31f5817b8803-os-release\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 
06:45:38.139337 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2080026c-9eee-4863-b62d-e9ce4d4525dd-host-run-netns\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.139353 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-slash\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.139397 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/331140be-ed04-4023-b244-31f5817b8803-os-release\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.139454 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-slash\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.149587 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/331140be-ed04-4023-b244-31f5817b8803-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.153949 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6936952c-09f0-48fd-8832-38c18202ae81-mcd-auth-proxy-config\") pod \"machine-config-daemon-tpgm6\" (UID: \"6936952c-09f0-48fd-8832-38c18202ae81\") " pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.154430 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.157576 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/58772108-964d-4d0c-90a4-70ad5fe1da2d-ovn-node-metrics-cert\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.157606 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zd8j\" (UniqueName: \"kubernetes.io/projected/58772108-964d-4d0c-90a4-70ad5fe1da2d-kube-api-access-9zd8j\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.158150 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rwfz\" (UniqueName: \"kubernetes.io/projected/453226ed-506e-48cb-89a8-a03ca92660e0-kube-api-access-8rwfz\") pod \"node-resolver-689qf\" (UID: \"453226ed-506e-48cb-89a8-a03ca92660e0\") " pod="openshift-dns/node-resolver-689qf" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.158598 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/58772108-964d-4d0c-90a4-70ad5fe1da2d-env-overrides\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.158803 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2080026c-9eee-4863-b62d-e9ce4d4525dd-cni-binary-copy\") pod \"multus-zpn9l\" 
(UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.161310 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl2v8\" (UniqueName: \"kubernetes.io/projected/331140be-ed04-4023-b244-31f5817b8803-kube-api-access-rl2v8\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.161720 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6936952c-09f0-48fd-8832-38c18202ae81-proxy-tls\") pod \"machine-config-daemon-tpgm6\" (UID: \"6936952c-09f0-48fd-8832-38c18202ae81\") " pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.162554 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9zb5\" (UniqueName: \"kubernetes.io/projected/6936952c-09f0-48fd-8832-38c18202ae81-kube-api-access-s9zb5\") pod \"machine-config-daemon-tpgm6\" (UID: \"6936952c-09f0-48fd-8832-38c18202ae81\") " pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.165674 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2080026c-9eee-4863-b62d-e9ce4d4525dd-multus-daemon-config\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.174241 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.190540 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.198586 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/58772108-964d-4d0c-90a4-70ad5fe1da2d-ovnkube-script-lib\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.198749 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/58772108-964d-4d0c-90a4-70ad5fe1da2d-ovnkube-config\") pod \"ovnkube-node-587xc\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.199418 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xhsv\" (UniqueName: \"kubernetes.io/projected/2080026c-9eee-4863-b62d-e9ce4d4525dd-kube-api-access-2xhsv\") pod \"multus-zpn9l\" (UID: \"2080026c-9eee-4863-b62d-e9ce4d4525dd\") " pod="openshift-multus/multus-zpn9l" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.203989 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.208127 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/331140be-ed04-4023-b244-31f5817b8803-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " 
pod="openshift-multus/multus-additional-cni-plugins-m79r8" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.208607 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/331140be-ed04-4023-b244-31f5817b8803-cni-binary-copy\") pod \"multus-additional-cni-plugins-m79r8\" (UID: \"331140be-ed04-4023-b244-31f5817b8803\") " pod="openshift-multus/multus-additional-cni-plugins-m79r8" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.217993 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.227576 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:38 crc kubenswrapper[4845]: E1006 06:45:38.227991 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.228416 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:45:38 crc kubenswrapper[4845]: E1006 06:45:38.228489 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.228533 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:45:38 crc kubenswrapper[4845]: E1006 06:45:38.228574 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.230449 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.230986 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.232512 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.232968 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f50cd86d2d127219a031cf1cda5eb9c0cddbb0cdf563cd7adbb9a43b3809bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:29Z\\\",\\\"message\\\":\\\"W1006 06:45:19.388973 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 06:45:19.389328 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759733119 cert, and key in /tmp/serving-cert-2046128313/serving-signer.crt, /tmp/serving-cert-2046128313/serving-signer.key\\\\nI1006 06:45:19.582383 1 observer_polling.go:159] Starting file observer\\\\nW1006 06:45:19.587180 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1006 06:45:19.587535 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:19.588337 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2046128313/tls.crt::/tmp/serving-cert-2046128313/tls.key\\\\\\\"\\\\nF1006 06:45:29.814735 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.233155 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.234101 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.234610 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.235202 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.236122 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.236797 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.237784 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.238421 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.241574 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.242153 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.243149 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.243770 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.244759 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.245353 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" 
path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.245441 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.245888 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.246922 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.247531 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.248892 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.249719 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.250236 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.251471 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.251991 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.253055 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.254239 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.254752 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.255921 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.256469 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.256952 4845 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.257447 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.259304 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.259941 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.261212 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.263414 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.265127 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.265528 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.271288 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.271904 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.273070 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.273774 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.273849 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.274130 4845 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-689qf" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.274408 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.277276 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.278533 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.279180 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.280140 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.280743 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.281783 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.282845 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.283573 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.283844 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.284501 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.285549 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.286856 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: W1006 06:45:38.286254 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58772108_964d_4d0c_90a4_70ad5fe1da2d.slice/crio-1edac2b123f605cafe95364a4698dd6563a25c0d8c5d7af759cef5ee95439e65 WatchSource:0}: Error finding container 1edac2b123f605cafe95364a4698dd6563a25c0d8c5d7af759cef5ee95439e65: Status 404 returned error can't find the container with id 1edac2b123f605cafe95364a4698dd6563a25c0d8c5d7af759cef5ee95439e65 Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.287505 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.288508 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.292680 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zpn9l" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.293842 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.300688 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m79r8" Oct 06 06:45:38 crc kubenswrapper[4845]: W1006 06:45:38.302292 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6936952c_09f0_48fd_8832_38c18202ae81.slice/crio-c6069ad02119e159d649b7fa88304cbc873fa3027904e0948729693535c008a5 WatchSource:0}: Error finding container c6069ad02119e159d649b7fa88304cbc873fa3027904e0948729693535c008a5: Status 404 returned error can't find the container with id c6069ad02119e159d649b7fa88304cbc873fa3027904e0948729693535c008a5 Oct 06 06:45:38 crc kubenswrapper[4845]: W1006 06:45:38.325435 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod331140be_ed04_4023_b244_31f5817b8803.slice/crio-4ad2d5e0fa53c381dcf1c17ecfc81a57f0809380ea7c243079e60707baf04b31 WatchSource:0}: Error finding container 4ad2d5e0fa53c381dcf1c17ecfc81a57f0809380ea7c243079e60707baf04b31: Status 404 returned error can't find the container with id 4ad2d5e0fa53c381dcf1c17ecfc81a57f0809380ea7c243079e60707baf04b31 Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.388218 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" event={"ID":"331140be-ed04-4023-b244-31f5817b8803","Type":"ContainerStarted","Data":"4ad2d5e0fa53c381dcf1c17ecfc81a57f0809380ea7c243079e60707baf04b31"} Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.389388 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerStarted","Data":"1edac2b123f605cafe95364a4698dd6563a25c0d8c5d7af759cef5ee95439e65"} Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.391332 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.394889 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zpn9l" event={"ID":"2080026c-9eee-4863-b62d-e9ce4d4525dd","Type":"ContainerStarted","Data":"70cbdfe5351dfbe3de8e1882cf743c27ba778298bb680bb77d778b010392aa31"} Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.395886 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerStarted","Data":"c6069ad02119e159d649b7fa88304cbc873fa3027904e0948729693535c008a5"} Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.396116 4845 scope.go:117] "RemoveContainer" containerID="526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb" Oct 06 06:45:38 crc kubenswrapper[4845]: E1006 06:45:38.396277 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.397669 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-689qf" event={"ID":"453226ed-506e-48cb-89a8-a03ca92660e0","Type":"ContainerStarted","Data":"90b4e4115cff9a381150fa180afc1e3d4ed21537d968f656795774eb81bbb5c6"} Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.398782 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.407366 4845 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.419230 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.445770 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.465846 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.485207 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.497723 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.510645 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.531084 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.547639 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.562783 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.591512 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.620431 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.676049 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.691807 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.705829 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.717457 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.728552 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.742454 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 
06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.754973 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.771962 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.782442 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.795280 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.813222 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:38 crc kubenswrapper[4845]: I1006 06:45:38.829076 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:38Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.081227 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.085005 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.089854 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.095880 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 
06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.108185 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.124228 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.139403 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.152138 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.164779 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.176439 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.191188 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.203693 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.226369 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.242014 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.255702 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.268004 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.281060 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.294945 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.308053 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.327595 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.340902 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.355050 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.369711 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.381429 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.406665 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" 
event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerStarted","Data":"a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7"} Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.406724 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerStarted","Data":"cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca"} Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.408906 4845 generic.go:334] "Generic (PLEG): container finished" podID="331140be-ed04-4023-b244-31f5817b8803" containerID="0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69" exitCode=0 Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.408992 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" event={"ID":"331140be-ed04-4023-b244-31f5817b8803","Type":"ContainerDied","Data":"0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69"} Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.409781 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.410291 4845 generic.go:334] "Generic (PLEG): container finished" podID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerID="5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084" exitCode=0 Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.410316 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerDied","Data":"5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084"} Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.412070 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-689qf" event={"ID":"453226ed-506e-48cb-89a8-a03ca92660e0","Type":"ContainerStarted","Data":"93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68"} Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.415020 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f"} Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.416901 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zpn9l" event={"ID":"2080026c-9eee-4863-b62d-e9ce4d4525dd","Type":"ContainerStarted","Data":"8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992"} Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.418548 4845 scope.go:117] "RemoveContainer" containerID="526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb" Oct 06 06:45:39 crc kubenswrapper[4845]: E1006 06:45:39.418835 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.451017 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.490725 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.524021 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.531515 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.563676 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.605093 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.644893 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c
6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.684842 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.723997 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.765961 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 
06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.802096 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.847224 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.883327 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.925664 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:39 crc kubenswrapper[4845]: I1006 06:45:39.968600 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:39Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.002796 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:40Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.045047 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T0
6:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:40Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.157862 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.157942 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.158000 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.158039 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:40 crc kubenswrapper[4845]: E1006 06:45:40.158103 4845 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 06:45:40 crc kubenswrapper[4845]: E1006 06:45:40.158128 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:45:44.158091817 +0000 UTC m=+28.672832855 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:45:40 crc kubenswrapper[4845]: E1006 06:45:40.158166 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 06:45:44.158152118 +0000 UTC m=+28.672893136 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.158226 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:40 crc kubenswrapper[4845]: E1006 06:45:40.158261 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 06:45:40 crc kubenswrapper[4845]: E1006 06:45:40.158278 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 06:45:40 crc kubenswrapper[4845]: E1006 06:45:40.158289 4845 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:40 crc kubenswrapper[4845]: E1006 06:45:40.158323 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-10-06 06:45:44.158308122 +0000 UTC m=+28.673049120 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:40 crc kubenswrapper[4845]: E1006 06:45:40.158385 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 06:45:40 crc kubenswrapper[4845]: E1006 06:45:40.158394 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 06:45:40 crc kubenswrapper[4845]: E1006 06:45:40.158401 4845 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:40 crc kubenswrapper[4845]: E1006 06:45:40.158419 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 06:45:44.158413144 +0000 UTC m=+28.673154152 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:40 crc kubenswrapper[4845]: E1006 06:45:40.158472 4845 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 06:45:40 crc kubenswrapper[4845]: E1006 06:45:40.158548 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 06:45:44.158536367 +0000 UTC m=+28.673277395 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.228960 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.228923 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:40 crc kubenswrapper[4845]: E1006 06:45:40.229109 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:45:40 crc kubenswrapper[4845]: E1006 06:45:40.229208 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.229214 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:45:40 crc kubenswrapper[4845]: E1006 06:45:40.229305 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.421601 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" event={"ID":"331140be-ed04-4023-b244-31f5817b8803","Type":"ContainerStarted","Data":"2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3"} Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.432603 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerStarted","Data":"4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52"} Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.432654 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerStarted","Data":"0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429"} Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.432665 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerStarted","Data":"17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6"} Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.432673 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerStarted","Data":"04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0"} Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.432682 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" 
event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerStarted","Data":"8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e"} Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.432691 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerStarted","Data":"e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413"} Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.433820 4845 scope.go:117] "RemoveContainer" containerID="526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb" Oct 06 06:45:40 crc kubenswrapper[4845]: E1006 06:45:40.433995 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.438575 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2
f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:40Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.452140 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:40Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.474355 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 
06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:40Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.496820 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:40Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.520937 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:45:40Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.542663 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c
6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:40Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.559395 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:40Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.574987 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:40Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.585849 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:40Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.596614 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:40Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.609819 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:40Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.624040 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:40Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:40 crc kubenswrapper[4845]: I1006 06:45:40.644833 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:40Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:41 crc kubenswrapper[4845]: I1006 06:45:41.437223 4845 generic.go:334] "Generic (PLEG): container finished" podID="331140be-ed04-4023-b244-31f5817b8803" containerID="2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3" exitCode=0 Oct 06 06:45:41 crc kubenswrapper[4845]: I1006 06:45:41.437296 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" event={"ID":"331140be-ed04-4023-b244-31f5817b8803","Type":"ContainerDied","Data":"2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3"} Oct 06 06:45:41 crc kubenswrapper[4845]: I1006 06:45:41.437826 4845 scope.go:117] "RemoveContainer" containerID="526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb" Oct 06 06:45:41 crc kubenswrapper[4845]: E1006 06:45:41.437945 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 06 06:45:41 crc kubenswrapper[4845]: I1006 06:45:41.457846 4845 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:41Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:41 crc kubenswrapper[4845]: I1006 06:45:41.474188 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:41Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:41 crc kubenswrapper[4845]: I1006 06:45:41.491312 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 
06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:41Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:41 crc kubenswrapper[4845]: I1006 06:45:41.506883 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:41Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:41 crc kubenswrapper[4845]: I1006 06:45:41.517904 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:45:41Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:41 crc kubenswrapper[4845]: I1006 06:45:41.529296 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c
6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:41Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:41 crc kubenswrapper[4845]: I1006 06:45:41.541982 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:41Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:41 crc kubenswrapper[4845]: I1006 06:45:41.552170 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:41Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:41 crc kubenswrapper[4845]: I1006 06:45:41.566451 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:41Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:41 crc kubenswrapper[4845]: I1006 06:45:41.581988 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:41Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:41 crc kubenswrapper[4845]: I1006 06:45:41.601392 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:41Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:41 crc kubenswrapper[4845]: I1006 06:45:41.612355 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:41Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:41 crc kubenswrapper[4845]: I1006 06:45:41.624053 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T0
6:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:41Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.226298 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.226298 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:42 crc kubenswrapper[4845]: E1006 06:45:42.226760 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.226411 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:45:42 crc kubenswrapper[4845]: E1006 06:45:42.226885 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:45:42 crc kubenswrapper[4845]: E1006 06:45:42.227064 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.441689 4845 generic.go:334] "Generic (PLEG): container finished" podID="331140be-ed04-4023-b244-31f5817b8803" containerID="f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a" exitCode=0 Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.441874 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" event={"ID":"331140be-ed04-4023-b244-31f5817b8803","Type":"ContainerDied","Data":"f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a"} Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.460653 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e
ebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:42Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.476067 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:42Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.490967 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:45:42Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.504877 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c
6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:42Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.519713 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:42Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.541144 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:42Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.554922 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:42Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.571452 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:42Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.586865 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:42Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 
06:45:42.599551 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:42Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.620031 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:42Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.631555 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:42Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.646733 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T0
6:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:42Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.812323 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.814182 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.814228 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.814239 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.814366 4845 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.821342 4845 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.821572 4845 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.822705 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.822764 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.822783 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 
06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.822815 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.822835 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:42Z","lastTransitionTime":"2025-10-06T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:42 crc kubenswrapper[4845]: E1006 06:45:42.843924 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"kubelet 
has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbf
ff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3e
e8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\
"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:42Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.848758 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.848811 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.848831 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.848860 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.848882 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:42Z","lastTransitionTime":"2025-10-06T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:42 crc kubenswrapper[4845]: E1006 06:45:42.868185 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:42Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.873701 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.873750 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.873760 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.873777 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.873789 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:42Z","lastTransitionTime":"2025-10-06T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:42 crc kubenswrapper[4845]: E1006 06:45:42.889224 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:42Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.894014 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.894077 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.894097 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.894125 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.894149 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:42Z","lastTransitionTime":"2025-10-06T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:42 crc kubenswrapper[4845]: E1006 06:45:42.909609 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:42Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.913404 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.913533 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.913595 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.913702 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.913781 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:42Z","lastTransitionTime":"2025-10-06T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:42 crc kubenswrapper[4845]: E1006 06:45:42.926803 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:42Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:42 crc kubenswrapper[4845]: E1006 06:45:42.927269 4845 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.929011 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.929079 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.929099 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.929130 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:42 crc kubenswrapper[4845]: I1006 06:45:42.929155 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:42Z","lastTransitionTime":"2025-10-06T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.031998 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.032049 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.032062 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.032080 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.032090 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:43Z","lastTransitionTime":"2025-10-06T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.135267 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.135349 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.135397 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.135429 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.135448 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:43Z","lastTransitionTime":"2025-10-06T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.238685 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.238734 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.238744 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.238771 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.239028 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:43Z","lastTransitionTime":"2025-10-06T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.374942 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.375003 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.375022 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.375052 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.375071 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:43Z","lastTransitionTime":"2025-10-06T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.447909 4845 generic.go:334] "Generic (PLEG): container finished" podID="331140be-ed04-4023-b244-31f5817b8803" containerID="b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636" exitCode=0 Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.448125 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" event={"ID":"331140be-ed04-4023-b244-31f5817b8803","Type":"ContainerDied","Data":"b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636"} Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.457428 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerStarted","Data":"14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d"} Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.463750 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.469338 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-8bzqb"] Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.469772 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-8bzqb" Oct 06 06:45:43 crc kubenswrapper[4845]: W1006 06:45:43.472804 4845 reflector.go:561] object-"openshift-image-registry"/"node-ca-dockercfg-4777p": failed to list *v1.Secret: secrets "node-ca-dockercfg-4777p" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Oct 06 06:45:43 crc kubenswrapper[4845]: E1006 06:45:43.472846 4845 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4777p\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-ca-dockercfg-4777p\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.472945 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.473092 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.473093 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.476562 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.478353 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.478512 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.478617 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.478737 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.478816 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:43Z","lastTransitionTime":"2025-10-06T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.492452 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.504448 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.527113 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.544484 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2
f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.560473 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.579694 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 
06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.582112 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.582144 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.582152 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.582170 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:43 crc 
kubenswrapper[4845]: I1006 06:45:43.582180 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:43Z","lastTransitionTime":"2025-10-06T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.592635 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.596480 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/999aace8-0c91-47c0-aee3-439e419a45c8-host\") pod \"node-ca-8bzqb\" (UID: \"999aace8-0c91-47c0-aee3-439e419a45c8\") " pod="openshift-image-registry/node-ca-8bzqb" Oct 06 
06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.596529 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jbr7\" (UniqueName: \"kubernetes.io/projected/999aace8-0c91-47c0-aee3-439e419a45c8-kube-api-access-5jbr7\") pod \"node-ca-8bzqb\" (UID: \"999aace8-0c91-47c0-aee3-439e419a45c8\") " pod="openshift-image-registry/node-ca-8bzqb" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.596580 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/999aace8-0c91-47c0-aee3-439e419a45c8-serviceca\") pod \"node-ca-8bzqb\" (UID: \"999aace8-0c91-47c0-aee3-439e419a45c8\") " pod="openshift-image-registry/node-ca-8bzqb" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.604409 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335a
c39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.619108 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.631644 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.648762 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.662095 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2
f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.679321 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.687726 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:43 crc 
kubenswrapper[4845]: I1006 06:45:43.687762 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.687770 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.687784 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.687793 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:43Z","lastTransitionTime":"2025-10-06T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.698055 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/999aace8-0c91-47c0-aee3-439e419a45c8-host\") pod \"node-ca-8bzqb\" (UID: \"999aace8-0c91-47c0-aee3-439e419a45c8\") " pod="openshift-image-registry/node-ca-8bzqb" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.698092 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jbr7\" (UniqueName: \"kubernetes.io/projected/999aace8-0c91-47c0-aee3-439e419a45c8-kube-api-access-5jbr7\") pod \"node-ca-8bzqb\" (UID: \"999aace8-0c91-47c0-aee3-439e419a45c8\") " pod="openshift-image-registry/node-ca-8bzqb" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.698112 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/999aace8-0c91-47c0-aee3-439e419a45c8-serviceca\") pod 
\"node-ca-8bzqb\" (UID: \"999aace8-0c91-47c0-aee3-439e419a45c8\") " pod="openshift-image-registry/node-ca-8bzqb" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.698127 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/999aace8-0c91-47c0-aee3-439e419a45c8-host\") pod \"node-ca-8bzqb\" (UID: \"999aace8-0c91-47c0-aee3-439e419a45c8\") " pod="openshift-image-registry/node-ca-8bzqb" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.698521 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.699038 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/999aace8-0c91-47c0-aee3-439e419a45c8-serviceca\") pod \"node-ca-8bzqb\" (UID: \"999aace8-0c91-47c0-aee3-439e419a45c8\") " pod="openshift-image-registry/node-ca-8bzqb" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.715110 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.726077 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jbr7\" (UniqueName: \"kubernetes.io/projected/999aace8-0c91-47c0-aee3-439e419a45c8-kube-api-access-5jbr7\") pod \"node-ca-8bzqb\" (UID: \"999aace8-0c91-47c0-aee3-439e419a45c8\") " pod="openshift-image-registry/node-ca-8bzqb" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.735496 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.750139 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.762809 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.777915 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.790723 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.790779 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.790797 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.790821 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.790841 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:43Z","lastTransitionTime":"2025-10-06T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.795504 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.809696 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.824769 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.847655 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.867304 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.892514 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:43Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.893333 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.893409 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.893424 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.893447 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.893460 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:43Z","lastTransitionTime":"2025-10-06T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.996694 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.996762 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.996775 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.996796 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:43 crc kubenswrapper[4845]: I1006 06:45:43.996811 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:43Z","lastTransitionTime":"2025-10-06T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.099175 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.099235 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.099247 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.099268 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.099282 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:44Z","lastTransitionTime":"2025-10-06T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.201510 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.201559 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.201573 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.201591 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.201604 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:44Z","lastTransitionTime":"2025-10-06T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.203961 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.204017 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.204037 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.204057 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:45:44 crc kubenswrapper[4845]: E1006 06:45:44.204093 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:45:52.204062577 +0000 UTC m=+36.718803585 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:45:44 crc kubenswrapper[4845]: E1006 06:45:44.204176 4845 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 06:45:44 crc kubenswrapper[4845]: E1006 06:45:44.204199 4845 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 06:45:44 crc kubenswrapper[4845]: E1006 06:45:44.204459 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 06:45:52.204438806 +0000 UTC m=+36.719179814 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 06:45:44 crc kubenswrapper[4845]: E1006 06:45:44.204482 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 06:45:52.204473887 +0000 UTC m=+36.719214895 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.204510 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:45:44 crc kubenswrapper[4845]: E1006 06:45:44.204522 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 06:45:44 crc kubenswrapper[4845]: E1006 06:45:44.204540 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 06:45:44 crc 
kubenswrapper[4845]: E1006 06:45:44.204553 4845 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:44 crc kubenswrapper[4845]: E1006 06:45:44.204595 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 06:45:52.20458755 +0000 UTC m=+36.719328558 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:44 crc kubenswrapper[4845]: E1006 06:45:44.204631 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 06:45:44 crc kubenswrapper[4845]: E1006 06:45:44.204644 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 06:45:44 crc kubenswrapper[4845]: E1006 06:45:44.204655 4845 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Oct 06 06:45:44 crc kubenswrapper[4845]: E1006 06:45:44.204686 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 06:45:52.204679352 +0000 UTC m=+36.719420350 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.226629 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.226685 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.226643 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:45:44 crc kubenswrapper[4845]: E1006 06:45:44.226803 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:45:44 crc kubenswrapper[4845]: E1006 06:45:44.226970 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:45:44 crc kubenswrapper[4845]: E1006 06:45:44.226888 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.303254 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.303294 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.303304 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.303321 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.303338 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:44Z","lastTransitionTime":"2025-10-06T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.406239 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.406270 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.406279 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.406294 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.406303 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:44Z","lastTransitionTime":"2025-10-06T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.464394 4845 generic.go:334] "Generic (PLEG): container finished" podID="331140be-ed04-4023-b244-31f5817b8803" containerID="bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698" exitCode=0 Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.464436 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" event={"ID":"331140be-ed04-4023-b244-31f5817b8803","Type":"ContainerDied","Data":"bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698"} Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.482455 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:44Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.509003 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.509046 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.509060 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.509083 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.509099 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:44Z","lastTransitionTime":"2025-10-06T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.523515 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:44Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.536766 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:44Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.556604 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T0
6:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:44Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.570049 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:44Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.584909 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:44Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.599698 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:45:44Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.611485 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.611555 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.611489 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-ove
rrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:44Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.611576 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.611762 4845 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.611784 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:44Z","lastTransitionTime":"2025-10-06T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.629184 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:44Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.651544 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:44Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.666185 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 
06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:44Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.679600 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:44Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.696624 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:44Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.711024 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:44Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.715263 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.715324 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.715338 4845 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.715401 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.715420 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:44Z","lastTransitionTime":"2025-10-06T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.725997 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.732609 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-8bzqb" Oct 06 06:45:44 crc kubenswrapper[4845]: W1006 06:45:44.750615 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod999aace8_0c91_47c0_aee3_439e419a45c8.slice/crio-1fc87edbe22880a8dd2c8dbf01a498c14f44fe55aa8200963e5f9a92eb4d538a WatchSource:0}: Error finding container 1fc87edbe22880a8dd2c8dbf01a498c14f44fe55aa8200963e5f9a92eb4d538a: Status 404 returned error can't find the container with id 1fc87edbe22880a8dd2c8dbf01a498c14f44fe55aa8200963e5f9a92eb4d538a Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.818351 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.818416 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.818430 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.818452 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.818468 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:44Z","lastTransitionTime":"2025-10-06T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.922481 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.922516 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.922525 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.922540 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:44 crc kubenswrapper[4845]: I1006 06:45:44.922550 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:44Z","lastTransitionTime":"2025-10-06T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.026407 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.026441 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.026450 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.026466 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.026475 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:45Z","lastTransitionTime":"2025-10-06T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.130567 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.130617 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.130628 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.130647 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.130661 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:45Z","lastTransitionTime":"2025-10-06T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.234077 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.234128 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.234140 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.234164 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.234177 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:45Z","lastTransitionTime":"2025-10-06T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.336614 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.336659 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.336668 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.336686 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.336697 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:45Z","lastTransitionTime":"2025-10-06T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.439318 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.439359 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.439382 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.439401 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.439411 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:45Z","lastTransitionTime":"2025-10-06T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.472405 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerStarted","Data":"f3dee41930c2c07895dacced11ecfc9ce1cddfd85fc8f845088825ad28c618f2"} Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.472780 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.477130 4845 generic.go:334] "Generic (PLEG): container finished" podID="331140be-ed04-4023-b244-31f5817b8803" containerID="91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212" exitCode=0 Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.477268 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" event={"ID":"331140be-ed04-4023-b244-31f5817b8803","Type":"ContainerDied","Data":"91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212"} Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.481351 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8bzqb" event={"ID":"999aace8-0c91-47c0-aee3-439e419a45c8","Type":"ContainerStarted","Data":"b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105"} Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.481436 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8bzqb" event={"ID":"999aace8-0c91-47c0-aee3-439e419a45c8","Type":"ContainerStarted","Data":"1fc87edbe22880a8dd2c8dbf01a498c14f44fe55aa8200963e5f9a92eb4d538a"} Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.493461 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.504614 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.510930 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e
ebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.524158 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.535555 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.541341 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.541392 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.541405 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.541423 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.541434 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:45Z","lastTransitionTime":"2025-10-06T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.547408 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.558795 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.569941 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.580254 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.593684 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.607530 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.626338 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3dee41930c2c07895dacced11ecfc9ce1cddfd85fc8f845088825ad28c618f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.639286 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.648995 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.649037 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.649046 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.649062 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.649071 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:45Z","lastTransitionTime":"2025-10-06T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.663574 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z 
is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.675939 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.689135 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 
06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.703643 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.746829 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.752985 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.753015 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.753025 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.753042 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.753051 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:45Z","lastTransitionTime":"2025-10-06T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.770918 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.791803 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.804153 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.814123 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.822717 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.834332 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.844956 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.855201 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.855242 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.855252 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.855269 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.855278 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:45Z","lastTransitionTime":"2025-10-06T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.862725 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3dee41930c2c07895dacced11ecfc9ce1cddfd85fc8f845088825ad28c618f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.873117 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.883819 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\
\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.893981 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:45Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.958163 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.958205 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.958215 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.958231 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:45 crc kubenswrapper[4845]: I1006 06:45:45.958245 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:45Z","lastTransitionTime":"2025-10-06T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.061245 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.061309 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.061323 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.061345 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.061363 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:46Z","lastTransitionTime":"2025-10-06T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.164139 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.164215 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.164241 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.164276 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.164299 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:46Z","lastTransitionTime":"2025-10-06T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.225837 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.225900 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.225856 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:46 crc kubenswrapper[4845]: E1006 06:45:46.226001 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:45:46 crc kubenswrapper[4845]: E1006 06:45:46.226168 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:45:46 crc kubenswrapper[4845]: E1006 06:45:46.226286 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.248303 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.266261 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.266309 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.266319 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.266338 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.266350 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:46Z","lastTransitionTime":"2025-10-06T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.281086 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3dee41930c2c07895dacced11ecfc9ce1cddfd85fc8f845088825ad28c618f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.296524 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.308581 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.326391 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2
f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.344540 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.359277 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c
6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.369449 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.369494 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.369512 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.369535 4845 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.369548 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:46Z","lastTransitionTime":"2025-10-06T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.373632 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.388627 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.406946 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 
06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.421791 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.435761 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a87
8845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.447330 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.455940 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.472040 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.472074 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.472087 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.472193 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.472207 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:46Z","lastTransitionTime":"2025-10-06T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.488968 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" event={"ID":"331140be-ed04-4023-b244-31f5817b8803","Type":"ContainerStarted","Data":"42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787"} Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.489076 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.489805 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.507032 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.521545 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.524494 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 
06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.541309 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.554399 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.569266 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c
6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.574558 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.574610 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.574622 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.574647 4845 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.574659 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:46Z","lastTransitionTime":"2025-10-06T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.586766 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.597230 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.606120 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.621286 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f5
1bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.634830 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.663331 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3dee41930c2c07895dacced11ecfc9ce1cddfd85fc8f845088825ad28c618f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.677030 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.677460 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.677547 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.677567 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.677631 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.677659 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:46Z","lastTransitionTime":"2025-10-06T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.690514 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z 
is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.703708 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.717675 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd89
09e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.735789 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3dee41930c2c07895dacced11ecfc9ce1cddfd85fc8f845088825ad28c618f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.752607 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.765845 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\
\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.777180 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.780742 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.780841 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.780916 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.781006 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.781084 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:46Z","lastTransitionTime":"2025-10-06T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.791756 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.806103 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.819753 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.836766 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c
6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.855331 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.867696 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.884165 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.884431 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.884582 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:46 crc 
kubenswrapper[4845]: I1006 06:45:46.884738 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.884877 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:46Z","lastTransitionTime":"2025-10-06T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.884162 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.897751 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.916509 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f5
1bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:46Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.988425 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.988682 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.989023 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.989274 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:46 crc kubenswrapper[4845]: I1006 06:45:46.989412 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:46Z","lastTransitionTime":"2025-10-06T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.091935 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.092169 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.092231 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.092303 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.092383 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:47Z","lastTransitionTime":"2025-10-06T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.194770 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.195024 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.195108 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.195191 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.195278 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:47Z","lastTransitionTime":"2025-10-06T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.297332 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.297394 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.297406 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.297426 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.297439 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:47Z","lastTransitionTime":"2025-10-06T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.400025 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.400067 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.400092 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.400112 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.400122 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:47Z","lastTransitionTime":"2025-10-06T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.492402 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.501830 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.501861 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.501873 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.501888 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.501898 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:47Z","lastTransitionTime":"2025-10-06T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.604504 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.604565 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.604580 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.604603 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.604617 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:47Z","lastTransitionTime":"2025-10-06T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.706816 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.706872 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.706887 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.706907 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.706922 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:47Z","lastTransitionTime":"2025-10-06T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.809413 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.809459 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.809473 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.809493 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.809509 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:47Z","lastTransitionTime":"2025-10-06T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.911912 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.911960 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.911969 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.912003 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:47 crc kubenswrapper[4845]: I1006 06:45:47.912014 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:47Z","lastTransitionTime":"2025-10-06T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.015170 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.015227 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.015237 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.015255 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.015266 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:48Z","lastTransitionTime":"2025-10-06T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.118576 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.118612 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.118621 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.118639 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.118647 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:48Z","lastTransitionTime":"2025-10-06T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.221578 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.221624 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.221634 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.221650 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.221660 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:48Z","lastTransitionTime":"2025-10-06T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.226122 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.226191 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.226283 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:45:48 crc kubenswrapper[4845]: E1006 06:45:48.226479 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:45:48 crc kubenswrapper[4845]: E1006 06:45:48.226639 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:45:48 crc kubenswrapper[4845]: E1006 06:45:48.226787 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.328687 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.328734 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.328749 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.328767 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.328784 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:48Z","lastTransitionTime":"2025-10-06T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.431416 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.431470 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.431487 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.431510 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.431528 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:48Z","lastTransitionTime":"2025-10-06T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.497444 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovnkube-controller/0.log" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.500188 4845 generic.go:334] "Generic (PLEG): container finished" podID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerID="f3dee41930c2c07895dacced11ecfc9ce1cddfd85fc8f845088825ad28c618f2" exitCode=1 Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.500251 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerDied","Data":"f3dee41930c2c07895dacced11ecfc9ce1cddfd85fc8f845088825ad28c618f2"} Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.501074 4845 scope.go:117] "RemoveContainer" containerID="f3dee41930c2c07895dacced11ecfc9ce1cddfd85fc8f845088825ad28c618f2" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.518515 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:48Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.533992 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.534034 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.534045 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.534060 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.534070 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:48Z","lastTransitionTime":"2025-10-06T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.540887 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3dee41930c2c07895dacced11ecfc9ce1cddfd85fc8f845088825ad28c618f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3dee41930c2c07895dacced11ecfc9ce1cddfd85fc8f845088825ad28c618f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:45:47Z\\\",\\\"message\\\":\\\"r.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 06:45:47.687487 6107 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 06:45:47.687498 6107 handler.go:190] Sending *v1.NetworkPolicy event 
handler 4 for removal\\\\nI1006 06:45:47.687536 6107 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 06:45:47.687569 6107 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 06:45:47.687568 6107 factory.go:656] Stopping watch factory\\\\nI1006 06:45:47.687591 6107 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 06:45:47.687322 6107 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 06:45:47.687789 6107 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 06:45:47.687612 6107 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 06:45:47.687623 6107 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 06:45:47.687898 6107 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 06:45:47.687989 6107 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o
://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:48Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.543327 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.556956 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2
f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:48Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.574507 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:48Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.586226 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:48Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.599993 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:48Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.613854 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:48Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.627486 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 
06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:48Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.639785 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:48Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.640713 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.640737 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.640747 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.640764 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.640775 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:48Z","lastTransitionTime":"2025-10-06T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.652016 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:48Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.664289 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run
/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:48Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.676635 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:48Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.685390 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:48Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.701266 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f5
1bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:48Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.743623 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.743701 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.743729 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.743762 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.743785 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:48Z","lastTransitionTime":"2025-10-06T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.847108 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.847186 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.847211 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.847245 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.847268 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:48Z","lastTransitionTime":"2025-10-06T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.949978 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.950033 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.950055 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.950082 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:48 crc kubenswrapper[4845]: I1006 06:45:48.950096 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:48Z","lastTransitionTime":"2025-10-06T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.052799 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.052859 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.052875 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.052898 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.052919 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:49Z","lastTransitionTime":"2025-10-06T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.155232 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.155272 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.155283 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.155300 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.155310 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:49Z","lastTransitionTime":"2025-10-06T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.258244 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.258289 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.258299 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.258319 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.258330 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:49Z","lastTransitionTime":"2025-10-06T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.361019 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.361092 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.361109 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.361134 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.361174 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:49Z","lastTransitionTime":"2025-10-06T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.464201 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.464257 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.464266 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.464296 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.464306 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:49Z","lastTransitionTime":"2025-10-06T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.507770 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovnkube-controller/0.log" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.511090 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerStarted","Data":"37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b"} Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.511574 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.536326 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.552005 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.567124 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.567165 4845 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.567176 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.567194 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.567204 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:49Z","lastTransitionTime":"2025-10-06T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.569976 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.581904 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c
6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.596337 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.608199 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.619041 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.627290 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.642225 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f5
1bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.655423 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.669399 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.669438 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.669447 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.669463 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.669483 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:49Z","lastTransitionTime":"2025-10-06T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.671087 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3dee41930c2c07895dacced11ecfc9ce1cddfd85fc8f845088825ad28c618f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:45:47Z\\\",\\\"message\\\":\\\"r.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 06:45:47.687487 6107 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 06:45:47.687498 6107 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 06:45:47.687536 6107 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 06:45:47.687569 
6107 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 06:45:47.687568 6107 factory.go:656] Stopping watch factory\\\\nI1006 06:45:47.687591 6107 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 06:45:47.687322 6107 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 06:45:47.687789 6107 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 06:45:47.687612 6107 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 06:45:47.687623 6107 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 06:45:47.687898 6107 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 06:45:47.687989 6107 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\
":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.681172 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.692329 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.700388 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.771993 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.772058 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.772079 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.772104 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.772118 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:49Z","lastTransitionTime":"2025-10-06T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.874681 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.874735 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.874751 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.874776 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.874795 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:49Z","lastTransitionTime":"2025-10-06T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.977859 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.977918 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.977934 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.977957 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:49 crc kubenswrapper[4845]: I1006 06:45:49.977974 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:49Z","lastTransitionTime":"2025-10-06T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.080181 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.080239 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.080253 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.080275 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.080291 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:50Z","lastTransitionTime":"2025-10-06T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.182545 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.182586 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.182604 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.182627 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.182647 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:50Z","lastTransitionTime":"2025-10-06T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.226409 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.226446 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:50 crc kubenswrapper[4845]: E1006 06:45:50.226549 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:45:50 crc kubenswrapper[4845]: E1006 06:45:50.226681 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.226884 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:45:50 crc kubenswrapper[4845]: E1006 06:45:50.226977 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.284951 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.285002 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.285016 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.285039 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.285052 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:50Z","lastTransitionTime":"2025-10-06T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.387965 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.388316 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.388545 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.388761 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.388904 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:50Z","lastTransitionTime":"2025-10-06T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.491849 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.491911 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.491926 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.491947 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.491959 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:50Z","lastTransitionTime":"2025-10-06T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.516430 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovnkube-controller/1.log" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.517861 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovnkube-controller/0.log" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.521340 4845 generic.go:334] "Generic (PLEG): container finished" podID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerID="37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b" exitCode=1 Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.521420 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerDied","Data":"37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b"} Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.521475 4845 scope.go:117] "RemoveContainer" containerID="f3dee41930c2c07895dacced11ecfc9ce1cddfd85fc8f845088825ad28c618f2" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.522264 4845 scope.go:117] "RemoveContainer" containerID="37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b" Oct 06 06:45:50 crc kubenswrapper[4845]: E1006 06:45:50.522519 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.538018 4845 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.557620 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.576247 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.593955 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.596026 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.596088 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.596108 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.596134 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.596153 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:50Z","lastTransitionTime":"2025-10-06T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.617684 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16
e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.645634 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.671662 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j"] Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.672715 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.684927 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.685283 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 06 
06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.685692 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.698252 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ce71ec75-9d46-43ff-a08e-430ef60a6d9e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-smc4j\" (UID: \"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.698420 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lplhk\" (UniqueName: \"kubernetes.io/projected/ce71ec75-9d46-43ff-a08e-430ef60a6d9e-kube-api-access-lplhk\") pod \"ovnkube-control-plane-749d76644c-smc4j\" (UID: \"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.698534 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ce71ec75-9d46-43ff-a08e-430ef60a6d9e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-smc4j\" (UID: \"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.698590 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ce71ec75-9d46-43ff-a08e-430ef60a6d9e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-smc4j\" (UID: \"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" Oct 
06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.699322 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.699403 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.699419 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.699442 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.699458 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:50Z","lastTransitionTime":"2025-10-06T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.710595 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.725575 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.739510 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.755316 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.775075 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f5
1bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.793696 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.799467 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ce71ec75-9d46-43ff-a08e-430ef60a6d9e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-smc4j\" (UID: \"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.799546 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ce71ec75-9d46-43ff-a08e-430ef60a6d9e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-smc4j\" (UID: \"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.799598 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ce71ec75-9d46-43ff-a08e-430ef60a6d9e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-smc4j\" (UID: \"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.799652 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lplhk\" (UniqueName: \"kubernetes.io/projected/ce71ec75-9d46-43ff-a08e-430ef60a6d9e-kube-api-access-lplhk\") pod \"ovnkube-control-plane-749d76644c-smc4j\" (UID: \"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.801633 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ce71ec75-9d46-43ff-a08e-430ef60a6d9e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-smc4j\" (UID: \"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.801726 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.802147 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.801779 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ce71ec75-9d46-43ff-a08e-430ef60a6d9e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-smc4j\" (UID: \"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.802352 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.802551 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.802580 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:50Z","lastTransitionTime":"2025-10-06T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.812733 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ce71ec75-9d46-43ff-a08e-430ef60a6d9e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-smc4j\" (UID: \"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.824643 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lplhk\" (UniqueName: \"kubernetes.io/projected/ce71ec75-9d46-43ff-a08e-430ef60a6d9e-kube-api-access-lplhk\") pod \"ovnkube-control-plane-749d76644c-smc4j\" (UID: \"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.827808 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3dee41930c2c07895dacced11ecfc9ce1cddfd85fc8f845088825ad28c618f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:45:47Z\\\",\\\"message\\\":\\\"r.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 06:45:47.687487 6107 
handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 06:45:47.687498 6107 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 06:45:47.687536 6107 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 06:45:47.687569 6107 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 06:45:47.687568 6107 factory.go:656] Stopping watch factory\\\\nI1006 06:45:47.687591 6107 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 06:45:47.687322 6107 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 06:45:47.687789 6107 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 06:45:47.687612 6107 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 06:45:47.687623 6107 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 06:45:47.687898 6107 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 06:45:47.687989 6107 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:45:49Z\\\",\\\"message\\\":\\\" calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 
2025-08-24T17:21:41Z]\\\\nI1006 06:45:49.458874 6301 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-8bzqb\\\\nI1006 06:45:49.458883 6301 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-8bzqb in node crc\\\\nI1006 06:45:49.458820 6301 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dcc9e-c16a-4962-a6d2-4adeb0b929c4}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} 
name:Service_openshift-dns/dns-default_UDP_nod\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuberne
tes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4
ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.844146 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.867622 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3dee41930c2c07895dacced11ecfc9ce1cddfd85fc8f845088825ad28c618f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:45:47Z\\\",\\\"message\\\":\\\"r.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 06:45:47.687487 6107 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 06:45:47.687498 6107 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 06:45:47.687536 6107 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 06:45:47.687569 
6107 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 06:45:47.687568 6107 factory.go:656] Stopping watch factory\\\\nI1006 06:45:47.687591 6107 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 06:45:47.687322 6107 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 06:45:47.687789 6107 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 06:45:47.687612 6107 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 06:45:47.687623 6107 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 06:45:47.687898 6107 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 06:45:47.687989 6107 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:45:49Z\\\",\\\"message\\\":\\\" calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z]\\\\nI1006 06:45:49.458874 6301 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-8bzqb\\\\nI1006 06:45:49.458883 6301 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-8bzqb in node crc\\\\nI1006 
06:45:49.458820 6301 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dcc9e-c16a-4962-a6d2-4adeb0b929c4}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_nod\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/n
et.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.881741 4845 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.896289 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.905747 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:50 crc 
kubenswrapper[4845]: I1006 06:45:50.905801 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.905818 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.905841 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.905857 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:50Z","lastTransitionTime":"2025-10-06T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.909505 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.923005 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.937003 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.950739 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.961923 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.981119 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c
6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:50 crc kubenswrapper[4845]: I1006 06:45:50.998075 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:50Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.000358 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.010140 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.010193 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.010206 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.010227 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.010243 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:51Z","lastTransitionTime":"2025-10-06T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.017657 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: W1006 06:45:51.032791 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce71ec75_9d46_43ff_a08e_430ef60a6d9e.slice/crio-51f4e2965a0d3044dc0e9ab4301f358f456e9993da2455aa37efb1b2adf7e698 WatchSource:0}: Error finding container 51f4e2965a0d3044dc0e9ab4301f358f456e9993da2455aa37efb1b2adf7e698: Status 404 returned error can't find the container with id 51f4e2965a0d3044dc0e9ab4301f358f456e9993da2455aa37efb1b2adf7e698 Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.036341 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.054923 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f5
1bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.073576 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.112991 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.113032 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.113046 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.113070 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.113085 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:51Z","lastTransitionTime":"2025-10-06T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.216270 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.216321 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.216335 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.216355 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.216369 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:51Z","lastTransitionTime":"2025-10-06T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.318211 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.318258 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.318271 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.318291 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.318305 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:51Z","lastTransitionTime":"2025-10-06T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.421780 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.421877 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.421891 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.421910 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.421944 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:51Z","lastTransitionTime":"2025-10-06T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.524137 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.524177 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.524191 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.524209 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.524219 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:51Z","lastTransitionTime":"2025-10-06T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.528340 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovnkube-controller/1.log" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.531484 4845 scope.go:117] "RemoveContainer" containerID="37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b" Oct 06 06:45:51 crc kubenswrapper[4845]: E1006 06:45:51.531913 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.533418 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" event={"ID":"ce71ec75-9d46-43ff-a08e-430ef60a6d9e","Type":"ContainerStarted","Data":"e07a832ca018ee1d21598e72fa0fd375e08549e9298392b50cb87a4cfc20b9e6"} Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.533458 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" event={"ID":"ce71ec75-9d46-43ff-a08e-430ef60a6d9e","Type":"ContainerStarted","Data":"e2634dfb4315764b425165e34de939bda7cddade0d5a837a3d88e35d0e13bcce"} Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.533476 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" event={"ID":"ce71ec75-9d46-43ff-a08e-430ef60a6d9e","Type":"ContainerStarted","Data":"51f4e2965a0d3044dc0e9ab4301f358f456e9993da2455aa37efb1b2adf7e698"} Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.544879 4845 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.553557 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.566163 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f5
1bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.576727 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.587912 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.603309 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:45:49Z\\\",\\\"message\\\":\\\" calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call 
webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z]\\\\nI1006 06:45:49.458874 6301 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-8bzqb\\\\nI1006 06:45:49.458883 6301 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-8bzqb in node crc\\\\nI1006 06:45:49.458820 6301 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dcc9e-c16a-4962-a6d2-4adeb0b929c4}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_nod\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb87
5e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.614638 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2
f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.627442 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.627486 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.627499 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:51 crc 
kubenswrapper[4845]: I1006 06:45:51.627517 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.627531 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:51Z","lastTransitionTime":"2025-10-06T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.628047 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.637645 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c
9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.654030 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 
06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.667477 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.678971 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.690055 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c
6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.701763 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.713721 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.726465 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.730322 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.730387 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.730403 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.730424 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.730441 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:51Z","lastTransitionTime":"2025-10-06T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.743266 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.755518 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2634dfb4315764b425165e34de939bda7cddade0d5a837a3d88e35d0e13bcce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07a832ca018ee1d21598e72fa0fd375e08549e9298392b50cb87a4cfc20b9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.772312 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.788641 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.816033 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:45:49Z\\\",\\\"message\\\":\\\" calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z]\\\\nI1006 06:45:49.458874 6301 
obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-8bzqb\\\\nI1006 06:45:49.458883 6301 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-8bzqb in node crc\\\\nI1006 06:45:49.458820 6301 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dcc9e-c16a-4962-a6d2-4adeb0b929c4}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_nod\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb87
5e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.830256 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2
f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.833247 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.833320 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.833356 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:51 crc 
kubenswrapper[4845]: I1006 06:45:51.833413 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.833429 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:51Z","lastTransitionTime":"2025-10-06T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.849403 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.860979 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c
9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.880216 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.896347 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.911908 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c
6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.927808 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.939859 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.939899 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.939909 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.939925 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.939935 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:51Z","lastTransitionTime":"2025-10-06T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.953725 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:51 crc kubenswrapper[4845]: I1006 06:45:51.969995 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:51Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.042786 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.042842 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.042854 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.042873 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.042886 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:52Z","lastTransitionTime":"2025-10-06T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.146424 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.146481 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.146493 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.146512 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.146523 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:52Z","lastTransitionTime":"2025-10-06T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.193644 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-4l7qj"] Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.194525 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.194593 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.212528 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.212800 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.212916 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.213025 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs\") pod \"network-metrics-daemon-4l7qj\" (UID: \"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\") " pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.213132 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.213304 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6s2d\" (UniqueName: \"kubernetes.io/projected/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-kube-api-access-h6s2d\") pod \"network-metrics-daemon-4l7qj\" (UID: \"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\") " pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.213468 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.213650 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.213680 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.213693 4845 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:52 crc 
kubenswrapper[4845]: E1006 06:45:52.213746 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 06:46:08.213731242 +0000 UTC m=+52.728472250 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.213767 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:46:08.213759013 +0000 UTC m=+52.728500021 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.213774 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.213810 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.213877 4845 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.213995 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 06:46:08.213969308 +0000 UTC m=+52.728710396 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.213657 4845 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.213779 4845 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.214166 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 06:46:08.214103611 +0000 UTC m=+52.728844749 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.214004 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:52Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.214204 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 06:46:08.214187703 +0000 UTC m=+52.728928851 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.226463 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.226744 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.226798 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.226839 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.226956 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.227010 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.229111 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mo
untPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:52Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.249633 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:52 crc 
kubenswrapper[4845]: I1006 06:45:52.249665 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.249677 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.249695 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.249707 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:52Z","lastTransitionTime":"2025-10-06T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.256243 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:52Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.292098 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:52Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.312362 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:52Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.313985 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6s2d\" (UniqueName: \"kubernetes.io/projected/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-kube-api-access-h6s2d\") pod \"network-metrics-daemon-4l7qj\" (UID: \"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\") " 
pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.314059 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs\") pod \"network-metrics-daemon-4l7qj\" (UID: \"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\") " pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.314187 4845 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.314238 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs podName:f80a2f04-a041-4acb-ace9-c0e40aed5f6d nodeName:}" failed. No retries permitted until 2025-10-06 06:45:52.814222122 +0000 UTC m=+37.328963130 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs") pod "network-metrics-daemon-4l7qj" (UID: "f80a2f04-a041-4acb-ace9-c0e40aed5f6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.336422 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6s2d\" (UniqueName: \"kubernetes.io/projected/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-kube-api-access-h6s2d\") pod \"network-metrics-daemon-4l7qj\" (UID: \"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\") " pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.336620 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:52Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.352832 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.352859 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.352868 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:52 crc 
kubenswrapper[4845]: I1006 06:45:52.352884 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.352894 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:52Z","lastTransitionTime":"2025-10-06T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.357651 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:52Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.375218 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:52Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.386465 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:45:52Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.396276 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2634dfb4315764b425165e34de939bda7cddade0d5a837a3d88e35d0e13bcce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07a832ca018ee1d21598e72fa0fd375e08549e9298392b50cb87a4cfc20b9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:52Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.406393 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:52Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.414667 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:52Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.427737 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f5
1bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:52Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.444096 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:45:49Z\\\",\\\"message\\\":\\\" calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call 
webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z]\\\\nI1006 06:45:49.458874 6301 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-8bzqb\\\\nI1006 06:45:49.458883 6301 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-8bzqb in node crc\\\\nI1006 06:45:49.458820 6301 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dcc9e-c16a-4962-a6d2-4adeb0b929c4}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_nod\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb87
5e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:52Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.455258 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.455318 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.455242 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4l7qj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:52Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:52 crc 
kubenswrapper[4845]: I1006 06:45:52.455327 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.455467 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.455481 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:52Z","lastTransitionTime":"2025-10-06T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.469524 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:52Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.557893 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.557936 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.557946 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.557976 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.557988 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:52Z","lastTransitionTime":"2025-10-06T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.660950 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.661015 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.661028 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.661047 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.661060 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:52Z","lastTransitionTime":"2025-10-06T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.763589 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.763670 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.763689 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.763706 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.763723 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:52Z","lastTransitionTime":"2025-10-06T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.818878 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs\") pod \"network-metrics-daemon-4l7qj\" (UID: \"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\") " pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.819056 4845 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 06:45:52 crc kubenswrapper[4845]: E1006 06:45:52.819146 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs podName:f80a2f04-a041-4acb-ace9-c0e40aed5f6d nodeName:}" failed. No retries permitted until 2025-10-06 06:45:53.819122999 +0000 UTC m=+38.333864007 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs") pod "network-metrics-daemon-4l7qj" (UID: "f80a2f04-a041-4acb-ace9-c0e40aed5f6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.878606 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.878655 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.878671 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.878692 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.878705 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:52Z","lastTransitionTime":"2025-10-06T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.981614 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.981676 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.981696 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.981716 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:52 crc kubenswrapper[4845]: I1006 06:45:52.981728 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:52Z","lastTransitionTime":"2025-10-06T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.078343 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.078496 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.078525 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.078558 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.078578 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:53Z","lastTransitionTime":"2025-10-06T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:53 crc kubenswrapper[4845]: E1006 06:45:53.095615 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:53Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.104976 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.105041 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.105062 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.105089 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.105108 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:53Z","lastTransitionTime":"2025-10-06T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:53 crc kubenswrapper[4845]: E1006 06:45:53.123656 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:53Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.128021 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.128078 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.128087 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.128216 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.128229 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:53Z","lastTransitionTime":"2025-10-06T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:53 crc kubenswrapper[4845]: E1006 06:45:53.139656 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:53Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.144211 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.144292 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.144306 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.144332 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.144348 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:53Z","lastTransitionTime":"2025-10-06T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:53 crc kubenswrapper[4845]: E1006 06:45:53.160256 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:53Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.163907 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.163951 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.163961 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.163979 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.163990 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:53Z","lastTransitionTime":"2025-10-06T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:53 crc kubenswrapper[4845]: E1006 06:45:53.177230 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:53Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:53 crc kubenswrapper[4845]: E1006 06:45:53.177388 4845 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.179125 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.179158 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.179167 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.179183 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.179193 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:53Z","lastTransitionTime":"2025-10-06T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.281795 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.281839 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.281850 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.281869 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.281881 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:53Z","lastTransitionTime":"2025-10-06T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.384522 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.384579 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.384595 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.384616 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.384629 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:53Z","lastTransitionTime":"2025-10-06T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.487796 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.487859 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.487876 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.487907 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.487932 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:53Z","lastTransitionTime":"2025-10-06T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.591943 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.592007 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.592028 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.592062 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.592085 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:53Z","lastTransitionTime":"2025-10-06T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.695353 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.695442 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.695464 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.695511 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.695536 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:53Z","lastTransitionTime":"2025-10-06T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.802003 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.802104 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.802131 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.802170 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.802195 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:53Z","lastTransitionTime":"2025-10-06T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.829610 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs\") pod \"network-metrics-daemon-4l7qj\" (UID: \"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\") " pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:45:53 crc kubenswrapper[4845]: E1006 06:45:53.829818 4845 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 06:45:53 crc kubenswrapper[4845]: E1006 06:45:53.829949 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs podName:f80a2f04-a041-4acb-ace9-c0e40aed5f6d nodeName:}" failed. No retries permitted until 2025-10-06 06:45:55.829924056 +0000 UTC m=+40.344665074 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs") pod "network-metrics-daemon-4l7qj" (UID: "f80a2f04-a041-4acb-ace9-c0e40aed5f6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.904904 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.904981 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.905001 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.905032 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:53 crc kubenswrapper[4845]: I1006 06:45:53.905054 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:53Z","lastTransitionTime":"2025-10-06T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.007649 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.007706 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.007756 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.007786 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.007803 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:54Z","lastTransitionTime":"2025-10-06T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.110683 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.110737 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.110754 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.110776 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.110792 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:54Z","lastTransitionTime":"2025-10-06T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.212506 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.212546 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.212555 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.212571 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.212580 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:54Z","lastTransitionTime":"2025-10-06T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.226019 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.226081 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.226015 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.226023 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:54 crc kubenswrapper[4845]: E1006 06:45:54.226163 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:45:54 crc kubenswrapper[4845]: E1006 06:45:54.226194 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:45:54 crc kubenswrapper[4845]: E1006 06:45:54.226272 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:45:54 crc kubenswrapper[4845]: E1006 06:45:54.226339 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.315043 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.315079 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.315087 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.315103 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.315119 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:54Z","lastTransitionTime":"2025-10-06T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.417832 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.417888 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.417897 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.417914 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.417925 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:54Z","lastTransitionTime":"2025-10-06T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.520505 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.520555 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.520565 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.520584 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.520595 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:54Z","lastTransitionTime":"2025-10-06T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.623221 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.623269 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.623283 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.623310 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.623322 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:54Z","lastTransitionTime":"2025-10-06T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.725653 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.725698 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.725709 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.725724 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.725735 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:54Z","lastTransitionTime":"2025-10-06T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.827938 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.827981 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.827992 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.828010 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.828021 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:54Z","lastTransitionTime":"2025-10-06T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.930295 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.930334 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.930347 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.930366 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:54 crc kubenswrapper[4845]: I1006 06:45:54.930396 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:54Z","lastTransitionTime":"2025-10-06T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.033268 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.033314 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.033329 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.033355 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.033399 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:55Z","lastTransitionTime":"2025-10-06T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.136144 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.136199 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.136208 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.136226 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.136238 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:55Z","lastTransitionTime":"2025-10-06T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.227187 4845 scope.go:117] "RemoveContainer" containerID="526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.238655 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.238684 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.238696 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.238715 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.238726 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:55Z","lastTransitionTime":"2025-10-06T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.341414 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.341463 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.341474 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.341491 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.341502 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:55Z","lastTransitionTime":"2025-10-06T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.444258 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.444325 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.444349 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.444406 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.444431 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:55Z","lastTransitionTime":"2025-10-06T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.547144 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.547211 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.547236 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.547267 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.547288 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:55Z","lastTransitionTime":"2025-10-06T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.550365 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.552195 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cd70decc01c59307c53f001633b131d20f897ba9825503e5d557fa02d39406f1"} Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.552965 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.569426 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a872
3fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:55Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.584525 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\
":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:55Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.595238 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:55Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.605774 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:55Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.619774 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd70decc01c59307c53f001633b131d20f897ba9825503e5d557fa02d39406f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 
06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:55Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.632956 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:55Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.646322 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:45:55Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.651220 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.651259 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.651275 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.651295 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.651309 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:55Z","lastTransitionTime":"2025-10-06T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.659343 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:55Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.673855 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:55Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.685563 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:55Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.697643 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:55Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.713623 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f5
1bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:55Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.725601 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2634dfb4315764b425165e34de939bda7cddade0d5a837a3d88e35d0e13bcce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07a832ca018ee1d21598e72fa0fd375e0854
9e9298392b50cb87a4cfc20b9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:55Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.739942 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:55Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.753541 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.753582 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.753594 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.753614 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.753629 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:55Z","lastTransitionTime":"2025-10-06T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.765408 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:45:49Z\\\",\\\"message\\\":\\\" calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z]\\\\nI1006 06:45:49.458874 6301 
obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-8bzqb\\\\nI1006 06:45:49.458883 6301 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-8bzqb in node crc\\\\nI1006 06:45:49.458820 6301 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dcc9e-c16a-4962-a6d2-4adeb0b929c4}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_nod\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb87
5e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:55Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.777033 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4l7qj\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:55Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.849388 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs\") pod \"network-metrics-daemon-4l7qj\" (UID: \"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\") " pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:45:55 crc kubenswrapper[4845]: E1006 06:45:55.849549 4845 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 06:45:55 crc kubenswrapper[4845]: E1006 06:45:55.849611 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs podName:f80a2f04-a041-4acb-ace9-c0e40aed5f6d nodeName:}" failed. No retries permitted until 2025-10-06 06:45:59.849593254 +0000 UTC m=+44.364334262 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs") pod "network-metrics-daemon-4l7qj" (UID: "f80a2f04-a041-4acb-ace9-c0e40aed5f6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.855856 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.855904 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.855919 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.855940 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.855953 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:55Z","lastTransitionTime":"2025-10-06T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.958224 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.958273 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.958287 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.958308 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:55 crc kubenswrapper[4845]: I1006 06:45:55.958321 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:55Z","lastTransitionTime":"2025-10-06T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.060252 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.060298 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.060309 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.060327 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.060339 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:56Z","lastTransitionTime":"2025-10-06T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.162991 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.163040 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.163050 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.163067 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.163077 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:56Z","lastTransitionTime":"2025-10-06T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.225990 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.226025 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.226051 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.226085 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:45:56 crc kubenswrapper[4845]: E1006 06:45:56.226146 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:45:56 crc kubenswrapper[4845]: E1006 06:45:56.226213 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:45:56 crc kubenswrapper[4845]: E1006 06:45:56.226264 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:45:56 crc kubenswrapper[4845]: E1006 06:45:56.226489 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.240026 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:56Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.260141 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:45:49Z\\\",\\\"message\\\":\\\" calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z]\\\\nI1006 06:45:49.458874 6301 
obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-8bzqb\\\\nI1006 06:45:49.458883 6301 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-8bzqb in node crc\\\\nI1006 06:45:49.458820 6301 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dcc9e-c16a-4962-a6d2-4adeb0b929c4}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_nod\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb87
5e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:56Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.265235 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.265282 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.265292 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.265314 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.265326 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:56Z","lastTransitionTime":"2025-10-06T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.276434 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4l7qj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:56Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:56 crc 
kubenswrapper[4845]: I1006 06:45:56.288228 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:56Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.304636 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:56Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.323568 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:56Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.344010 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd70decc01c59307c53f001633b131d20f897ba9825503e5d557fa02d39406f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:56Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.359460 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:56Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.367437 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.367478 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.367489 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.367507 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.367521 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:56Z","lastTransitionTime":"2025-10-06T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.380067 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:56Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.399464 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:56Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.421699 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:56Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.435581 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:56Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.447436 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:56Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.458164 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:56Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.470366 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.470421 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.470430 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.470455 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.470466 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:56Z","lastTransitionTime":"2025-10-06T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.472905 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:56Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.490609 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2634dfb4315764b425165e34de939bda7cddade0d5a837a3d88e35d0e13bcce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07a832ca018ee1d21598e72fa0fd375e08549e9298392b50cb87a4cfc20b9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:56Z is after 2025-08-24T17:21:41Z" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.572969 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.573013 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.573022 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.573038 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.573048 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:56Z","lastTransitionTime":"2025-10-06T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.674967 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.675010 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.675019 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.675035 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.675046 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:56Z","lastTransitionTime":"2025-10-06T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.777886 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.777943 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.777952 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.777970 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.777979 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:56Z","lastTransitionTime":"2025-10-06T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.881815 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.881864 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.881876 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.881900 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.881941 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:56Z","lastTransitionTime":"2025-10-06T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.984086 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.984122 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.984131 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.984146 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:56 crc kubenswrapper[4845]: I1006 06:45:56.984158 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:56Z","lastTransitionTime":"2025-10-06T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.086423 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.086475 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.086487 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.086501 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.086512 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:57Z","lastTransitionTime":"2025-10-06T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.189267 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.189312 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.189322 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.189340 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.189352 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:57Z","lastTransitionTime":"2025-10-06T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.292199 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.292250 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.292260 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.292275 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.292284 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:57Z","lastTransitionTime":"2025-10-06T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.395861 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.395939 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.396011 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.396053 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.396074 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:57Z","lastTransitionTime":"2025-10-06T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.504445 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.504494 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.504503 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.504519 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.504529 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:57Z","lastTransitionTime":"2025-10-06T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.608086 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.608142 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.608154 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.608175 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.608187 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:57Z","lastTransitionTime":"2025-10-06T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.711311 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.711363 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.711390 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.711412 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.711423 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:57Z","lastTransitionTime":"2025-10-06T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.814166 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.814219 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.814231 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.814249 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.814260 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:57Z","lastTransitionTime":"2025-10-06T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.919810 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.920122 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.920134 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.920155 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:57 crc kubenswrapper[4845]: I1006 06:45:57.920167 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:57Z","lastTransitionTime":"2025-10-06T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.022619 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.022685 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.022701 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.022726 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.022744 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:58Z","lastTransitionTime":"2025-10-06T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.126613 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.126672 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.126683 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.126701 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.126713 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:58Z","lastTransitionTime":"2025-10-06T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.226461 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.226495 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.226640 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.226814 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:45:58 crc kubenswrapper[4845]: E1006 06:45:58.227275 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:45:58 crc kubenswrapper[4845]: E1006 06:45:58.227325 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:45:58 crc kubenswrapper[4845]: E1006 06:45:58.227344 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:45:58 crc kubenswrapper[4845]: E1006 06:45:58.227490 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.230551 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.230581 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.230590 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.230605 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.230617 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:58Z","lastTransitionTime":"2025-10-06T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.333083 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.333120 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.333129 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.333143 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.333153 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:58Z","lastTransitionTime":"2025-10-06T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.435999 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.436070 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.436082 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.436098 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.436107 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:58Z","lastTransitionTime":"2025-10-06T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.538970 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.539016 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.539028 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.539044 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.539055 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:58Z","lastTransitionTime":"2025-10-06T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.641826 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.641891 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.641907 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.641929 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.641943 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:58Z","lastTransitionTime":"2025-10-06T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.748732 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.748783 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.748794 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.748810 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.748820 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:58Z","lastTransitionTime":"2025-10-06T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.850905 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.850967 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.850985 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.851047 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.851066 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:58Z","lastTransitionTime":"2025-10-06T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.953318 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.953366 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.953402 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.953421 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:58 crc kubenswrapper[4845]: I1006 06:45:58.953434 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:58Z","lastTransitionTime":"2025-10-06T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.056297 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.056336 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.056344 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.056359 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.056369 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:59Z","lastTransitionTime":"2025-10-06T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.158450 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.158489 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.158499 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.158515 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.158525 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:59Z","lastTransitionTime":"2025-10-06T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.261406 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.261452 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.261479 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.261494 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.261509 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:59Z","lastTransitionTime":"2025-10-06T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.364230 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.364288 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.364304 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.364327 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.364343 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:59Z","lastTransitionTime":"2025-10-06T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.467396 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.467440 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.467449 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.467465 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.467475 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:59Z","lastTransitionTime":"2025-10-06T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.570935 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.570980 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.570994 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.571012 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.571025 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:59Z","lastTransitionTime":"2025-10-06T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.674994 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.675074 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.675099 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.675135 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.675160 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:59Z","lastTransitionTime":"2025-10-06T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.778167 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.778242 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.778260 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.778303 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.778319 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:59Z","lastTransitionTime":"2025-10-06T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.881306 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.881360 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.881397 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.881423 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.881434 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:59Z","lastTransitionTime":"2025-10-06T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.897070 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs\") pod \"network-metrics-daemon-4l7qj\" (UID: \"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\") " pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:45:59 crc kubenswrapper[4845]: E1006 06:45:59.897256 4845 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 06:45:59 crc kubenswrapper[4845]: E1006 06:45:59.897368 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs podName:f80a2f04-a041-4acb-ace9-c0e40aed5f6d nodeName:}" failed. No retries permitted until 2025-10-06 06:46:07.897344276 +0000 UTC m=+52.412085294 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs") pod "network-metrics-daemon-4l7qj" (UID: "f80a2f04-a041-4acb-ace9-c0e40aed5f6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.984326 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.984447 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.984470 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.984503 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:45:59 crc kubenswrapper[4845]: I1006 06:45:59.984529 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:45:59Z","lastTransitionTime":"2025-10-06T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.088139 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.088226 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.088251 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.088282 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.088314 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:00Z","lastTransitionTime":"2025-10-06T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.191251 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.191283 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.191292 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.191305 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.191314 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:00Z","lastTransitionTime":"2025-10-06T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.225787 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.225799 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:00 crc kubenswrapper[4845]: E1006 06:46:00.225884 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.225936 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.226039 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:00 crc kubenswrapper[4845]: E1006 06:46:00.226107 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:00 crc kubenswrapper[4845]: E1006 06:46:00.226277 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:46:00 crc kubenswrapper[4845]: E1006 06:46:00.226426 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.294506 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.294557 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.294572 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.294592 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.294604 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:00Z","lastTransitionTime":"2025-10-06T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.398043 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.398078 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.398087 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.398102 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.398113 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:00Z","lastTransitionTime":"2025-10-06T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.500333 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.500441 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.500463 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.500498 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.500520 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:00Z","lastTransitionTime":"2025-10-06T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.603169 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.603233 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.603251 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.603278 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.603299 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:00Z","lastTransitionTime":"2025-10-06T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.706772 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.706883 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.706903 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.706929 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.706947 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:00Z","lastTransitionTime":"2025-10-06T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.808885 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.808940 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.808951 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.808967 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.808978 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:00Z","lastTransitionTime":"2025-10-06T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.912077 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.912144 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.912153 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.912172 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:00 crc kubenswrapper[4845]: I1006 06:46:00.912182 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:00Z","lastTransitionTime":"2025-10-06T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.021686 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.021780 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.021805 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.021839 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.021865 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:01Z","lastTransitionTime":"2025-10-06T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.125218 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.125283 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.125300 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.125327 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.125344 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:01Z","lastTransitionTime":"2025-10-06T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.227888 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.227947 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.227965 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.227989 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.228008 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:01Z","lastTransitionTime":"2025-10-06T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.331199 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.331363 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.331405 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.331433 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.331453 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:01Z","lastTransitionTime":"2025-10-06T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.434778 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.434830 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.434847 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.434871 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.434889 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:01Z","lastTransitionTime":"2025-10-06T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.537935 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.537994 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.538011 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.538038 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.538056 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:01Z","lastTransitionTime":"2025-10-06T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.641227 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.641331 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.641350 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.641413 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.641432 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:01Z","lastTransitionTime":"2025-10-06T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.744308 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.744368 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.744437 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.744468 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.744485 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:01Z","lastTransitionTime":"2025-10-06T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.848113 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.848173 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.848190 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.848216 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.848236 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:01Z","lastTransitionTime":"2025-10-06T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.951491 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.951552 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.951570 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.951596 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:01 crc kubenswrapper[4845]: I1006 06:46:01.951617 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:01Z","lastTransitionTime":"2025-10-06T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.055066 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.055110 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.055124 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.055145 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.055157 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:02Z","lastTransitionTime":"2025-10-06T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.157920 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.157969 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.157989 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.158008 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.158019 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:02Z","lastTransitionTime":"2025-10-06T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.226437 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.226561 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.226454 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:02 crc kubenswrapper[4845]: E1006 06:46:02.226737 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.226773 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:02 crc kubenswrapper[4845]: E1006 06:46:02.227093 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:46:02 crc kubenswrapper[4845]: E1006 06:46:02.227328 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:02 crc kubenswrapper[4845]: E1006 06:46:02.228357 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.261491 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.261536 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.261544 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.261560 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.261570 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:02Z","lastTransitionTime":"2025-10-06T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.364217 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.364277 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.364295 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.364321 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.364341 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:02Z","lastTransitionTime":"2025-10-06T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.466790 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.466842 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.466866 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.466889 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.466904 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:02Z","lastTransitionTime":"2025-10-06T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.569484 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.569522 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.569532 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.569548 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.569559 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:02Z","lastTransitionTime":"2025-10-06T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.672471 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.672543 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.672565 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.672595 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.672619 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:02Z","lastTransitionTime":"2025-10-06T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.774722 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.774785 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.774803 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.774827 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.774845 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:02Z","lastTransitionTime":"2025-10-06T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.877078 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.877149 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.877173 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.877204 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.877225 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:02Z","lastTransitionTime":"2025-10-06T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.979989 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.980053 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.980072 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.980096 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:02 crc kubenswrapper[4845]: I1006 06:46:02.980116 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:02Z","lastTransitionTime":"2025-10-06T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.083600 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.083651 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.083662 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.083681 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.083696 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:03Z","lastTransitionTime":"2025-10-06T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.186733 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.186799 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.186817 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.186843 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.186862 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:03Z","lastTransitionTime":"2025-10-06T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.289671 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.289737 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.289761 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.289790 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.289811 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:03Z","lastTransitionTime":"2025-10-06T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.341943 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.341999 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.342009 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.342026 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.342038 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:03Z","lastTransitionTime":"2025-10-06T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:03 crc kubenswrapper[4845]: E1006 06:46:03.359074 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:03Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.363703 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.363751 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.363767 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.363792 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.363810 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:03Z","lastTransitionTime":"2025-10-06T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:03 crc kubenswrapper[4845]: E1006 06:46:03.389309 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:03Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.394243 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.394305 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.394324 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.394349 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.394367 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:03Z","lastTransitionTime":"2025-10-06T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:03 crc kubenswrapper[4845]: E1006 06:46:03.418317 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:03Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.422658 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.422712 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.422731 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.422756 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.422775 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:03Z","lastTransitionTime":"2025-10-06T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:03 crc kubenswrapper[4845]: E1006 06:46:03.444424 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:03Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.448900 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.448938 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.448952 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.448971 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.448986 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:03Z","lastTransitionTime":"2025-10-06T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:03 crc kubenswrapper[4845]: E1006 06:46:03.461873 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:03Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:03 crc kubenswrapper[4845]: E1006 06:46:03.462095 4845 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.464041 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.464158 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.464179 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.464202 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.464220 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:03Z","lastTransitionTime":"2025-10-06T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.568110 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.568157 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.568169 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.568187 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.568200 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:03Z","lastTransitionTime":"2025-10-06T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.671224 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.671289 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.671306 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.671331 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.671352 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:03Z","lastTransitionTime":"2025-10-06T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.773221 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.773273 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.773287 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.773308 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.773323 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:03Z","lastTransitionTime":"2025-10-06T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.875792 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.875865 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.875922 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.875958 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.875984 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:03Z","lastTransitionTime":"2025-10-06T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.978939 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.978982 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.978992 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.979008 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:03 crc kubenswrapper[4845]: I1006 06:46:03.979018 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:03Z","lastTransitionTime":"2025-10-06T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.081198 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.081234 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.081243 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.081272 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.081283 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:04Z","lastTransitionTime":"2025-10-06T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.184170 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.184218 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.184234 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.184260 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.184278 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:04Z","lastTransitionTime":"2025-10-06T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.226246 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.226246 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:04 crc kubenswrapper[4845]: E1006 06:46:04.226436 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.226602 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.226630 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:04 crc kubenswrapper[4845]: E1006 06:46:04.226687 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:04 crc kubenswrapper[4845]: E1006 06:46:04.226586 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.227306 4845 scope.go:117] "RemoveContainer" containerID="37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b" Oct 06 06:46:04 crc kubenswrapper[4845]: E1006 06:46:04.227502 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.286329 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.286561 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.286637 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.286716 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.286875 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:04Z","lastTransitionTime":"2025-10-06T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.391089 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.391143 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.391165 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.391200 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.391490 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:04Z","lastTransitionTime":"2025-10-06T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.495252 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.495299 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.495307 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.495323 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.495332 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:04Z","lastTransitionTime":"2025-10-06T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.584723 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovnkube-controller/1.log" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.587309 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerStarted","Data":"b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c"} Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.587686 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.598099 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.598129 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.598139 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.598152 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.598162 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:04Z","lastTransitionTime":"2025-10-06T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.604850 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:04Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.625360 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:04Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.652006 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd70decc01c59307c53f001633b131d20f897ba9825503e5d557fa02d39406f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 
06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:04Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.667306 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:04Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.680620 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:46:04Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.691783 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c
6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:04Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.699869 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.699899 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.699909 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.699924 4845 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.699933 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:04Z","lastTransitionTime":"2025-10-06T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.704677 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:04Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.716612 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:04Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.731393 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f5
1bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:04Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.741453 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2634dfb4315764b425165e34de939bda7cddade0d5a837a3d88e35d0e13bcce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07a832ca018ee1d21598e72fa0fd375e0854
9e9298392b50cb87a4cfc20b9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:04Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.751319 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4l7qj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:04Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:04 crc 
kubenswrapper[4845]: I1006 06:46:04.765916 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:04Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.785336 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:45:49Z\\\",\\\"message\\\":\\\" calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z]\\\\nI1006 06:45:49.458874 6301 
obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-8bzqb\\\\nI1006 06:45:49.458883 6301 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-8bzqb in node crc\\\\nI1006 06:45:49.458820 6301 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dcc9e-c16a-4962-a6d2-4adeb0b929c4}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} 
name:Service_openshift-dns/dns-default_UDP_nod\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\
\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:04Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.798116 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2
f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:04Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.801704 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.801727 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.801738 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:04 crc 
kubenswrapper[4845]: I1006 06:46:04.801755 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.801764 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:04Z","lastTransitionTime":"2025-10-06T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.820227 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:04Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.831074 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c
9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:04Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.904616 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.904653 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.904666 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.904683 4845 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:04 crc kubenswrapper[4845]: I1006 06:46:04.904695 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:04Z","lastTransitionTime":"2025-10-06T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.007414 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.007450 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.007462 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.007479 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.007492 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:05Z","lastTransitionTime":"2025-10-06T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.109906 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.109942 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.109953 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.109970 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.109980 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:05Z","lastTransitionTime":"2025-10-06T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.212218 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.212268 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.212282 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.212304 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.212330 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:05Z","lastTransitionTime":"2025-10-06T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.314792 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.314838 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.314849 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.314865 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.314876 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:05Z","lastTransitionTime":"2025-10-06T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.418340 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.418412 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.418424 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.418442 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.418451 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:05Z","lastTransitionTime":"2025-10-06T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.521899 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.521942 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.521950 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.521966 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.521975 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:05Z","lastTransitionTime":"2025-10-06T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.593829 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovnkube-controller/2.log" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.594950 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovnkube-controller/1.log" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.599194 4845 generic.go:334] "Generic (PLEG): container finished" podID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerID="b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c" exitCode=1 Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.599254 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerDied","Data":"b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c"} Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.599306 4845 scope.go:117] "RemoveContainer" containerID="37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.600555 4845 scope.go:117] "RemoveContainer" containerID="b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c" Oct 06 06:46:05 crc kubenswrapper[4845]: E1006 06:46:05.600813 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.618584 4845 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:05Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.624152 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.624634 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.624653 4845 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.624680 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.624699 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:05Z","lastTransitionTime":"2025-10-06T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.634026 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e359
1b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:05Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.643274 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7
a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:05Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.656248 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd70decc01c59307c53f001633b131d20f897ba9825503e5d557fa02d39406f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 
06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:05Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.668529 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:05Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.677564 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:46:05Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.687489 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c
6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:05Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.698684 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:05Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.726768 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.726807 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.726820 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.726834 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.726843 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:05Z","lastTransitionTime":"2025-10-06T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.728981 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:05Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.752646 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:05Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.760777 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:05Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.772202 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f5
1bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:05Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.781341 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2634dfb4315764b425165e34de939bda7cddade0d5a837a3d88e35d0e13bcce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07a832ca018ee1d21598e72fa0fd375e0854
9e9298392b50cb87a4cfc20b9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:05Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.790986 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:05Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.805962 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:45:49Z\\\",\\\"message\\\":\\\" calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z]\\\\nI1006 06:45:49.458874 6301 
obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-8bzqb\\\\nI1006 06:45:49.458883 6301 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-8bzqb in node crc\\\\nI1006 06:45:49.458820 6301 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dcc9e-c16a-4962-a6d2-4adeb0b929c4}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_nod\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:05Z\\\",\\\"message\\\":\\\"4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 06:46:05.017663 6538 services_controller.go:434] Service openshift-machine-api/control-plane-machine-set-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{control-plane-machine-set-operator 
openshift-machine-api ffd0ef27-d28d-43cc-90c8-0e8843e4c04c 4409 0 2025-02-23 05:12:21 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:control-plane-machine-set-operator] map[capability.openshift.io/name:MachineAPI exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true service.alpha.openshift.io/serving-cert-secret-name:control-plane-machine-set-operator-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075e04d7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: control-plane-machine-set-operator,},ClusterIP:10.217.4.41,Type:ClusterIP,ExternalIP\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log
\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:05Z is after 
2025-08-24T17:21:41Z" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.815609 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4l7qj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:05Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:05 crc 
kubenswrapper[4845]: I1006 06:46:05.833162 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.833208 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.833217 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.833235 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.833247 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:05Z","lastTransitionTime":"2025-10-06T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.935522 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.935597 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.935614 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.935638 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:05 crc kubenswrapper[4845]: I1006 06:46:05.935656 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:05Z","lastTransitionTime":"2025-10-06T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.038261 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.038296 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.038306 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.038319 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.038329 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:06Z","lastTransitionTime":"2025-10-06T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.141340 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.141445 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.141469 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.141496 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.141514 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:06Z","lastTransitionTime":"2025-10-06T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.203192 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.212094 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.219294 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.231312 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.246603 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.246653 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.246672 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:06 crc 
kubenswrapper[4845]: I1006 06:46:06.246695 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.246712 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:06Z","lastTransitionTime":"2025-10-06T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.248353 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.248370 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:06 crc kubenswrapper[4845]: E1006 06:46:06.248500 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.248540 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.248583 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:06 crc kubenswrapper[4845]: E1006 06:46:06.248662 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:46:06 crc kubenswrapper[4845]: E1006 06:46:06.248721 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:06 crc kubenswrapper[4845]: E1006 06:46:06.248788 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.249199 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd70decc01c59307c53f001633b131d20f897ba9825503e5d557fa02d39406f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.265433 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.275487 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.290233 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.299843 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.310789 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.332838 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f5
1bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.344104 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2634dfb4315764b425165e34de939bda7cddade0d5a837a3d88e35d0e13bcce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07a832ca018ee1d21598e72fa0fd375e0854
9e9298392b50cb87a4cfc20b9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.349285 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.349311 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.349319 4845 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.349333 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.349341 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:06Z","lastTransitionTime":"2025-10-06T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.355675 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4l7qj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 
06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.366998 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.396416 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:45:49Z\\\",\\\"message\\\":\\\" calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z]\\\\nI1006 06:45:49.458874 6301 
obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-8bzqb\\\\nI1006 06:45:49.458883 6301 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-8bzqb in node crc\\\\nI1006 06:45:49.458820 6301 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dcc9e-c16a-4962-a6d2-4adeb0b929c4}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_nod\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:05Z\\\",\\\"message\\\":\\\"4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 06:46:05.017663 6538 services_controller.go:434] Service openshift-machine-api/control-plane-machine-set-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{control-plane-machine-set-operator 
openshift-machine-api ffd0ef27-d28d-43cc-90c8-0e8843e4c04c 4409 0 2025-02-23 05:12:21 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:control-plane-machine-set-operator] map[capability.openshift.io/name:MachineAPI exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true service.alpha.openshift.io/serving-cert-secret-name:control-plane-machine-set-operator-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075e04d7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: control-plane-machine-set-operator,},ClusterIP:10.217.4.41,Type:ClusterIP,ExternalIP\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log
\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 
2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.407550 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.420023 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.429649 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.443735 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.451920 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.451978 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.452001 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.452033 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.452056 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:06Z","lastTransitionTime":"2025-10-06T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.461351 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z 
is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.473840 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.485349 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd70decc01c59307c53f001633b131d20f897ba9825503e5d557fa02d39406f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.497386 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.508576 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.518590 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.533266 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.543028 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.555697 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.555803 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.556407 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.556449 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.556467 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.556479 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:06Z","lastTransitionTime":"2025-10-06T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.565854 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.581825 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f99
76f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.592793 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2634dfb4315764b425165e34de939bda7cddade0d5a837a3d88e35d0e13bcce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07a832ca018ee1d21598e72fa0fd375e08549e9298392b50cb87a4cfc20b9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.602171 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff96c219-4289-4880-a8ab-ed6da7557dcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec9a2dc2d359f53f2165220af84f6ee28bfef0db39b86b12ebf600727e32e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03b87b0a15
c19a70cdb61338f704315d219bd382c5ced04b22851ef71a71464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e270831b0c56a937ba5cd52f2367bce7b5e8a2837c035aa4e112439829cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.605538 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovnkube-controller/2.log" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.609547 4845 scope.go:117] "RemoveContainer" containerID="b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c" Oct 06 06:46:06 crc kubenswrapper[4845]: E1006 06:46:06.609736 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.616550 
4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.636763 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37c6870709a97b8a5325ef2ffc44e7a455afcc2a4ed62ade372671f0c356dd6b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:45:49Z\\\",\\\"message\\\":\\\" calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:45:49Z is after 2025-08-24T17:21:41Z]\\\\nI1006 06:45:49.458874 6301 
obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-8bzqb\\\\nI1006 06:45:49.458883 6301 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-8bzqb in node crc\\\\nI1006 06:45:49.458820 6301 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dcc9e-c16a-4962-a6d2-4adeb0b929c4}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_nod\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:05Z\\\",\\\"message\\\":\\\"4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 06:46:05.017663 6538 services_controller.go:434] Service openshift-machine-api/control-plane-machine-set-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{control-plane-machine-set-operator 
openshift-machine-api ffd0ef27-d28d-43cc-90c8-0e8843e4c04c 4409 0 2025-02-23 05:12:21 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:control-plane-machine-set-operator] map[capability.openshift.io/name:MachineAPI exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true service.alpha.openshift.io/serving-cert-secret-name:control-plane-machine-set-operator-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075e04d7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: control-plane-machine-set-operator,},ClusterIP:10.217.4.41,Type:ClusterIP,ExternalIP\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log
\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 
2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.645689 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4l7qj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc 
kubenswrapper[4845]: I1006 06:46:06.654574 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.658789 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.658853 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.658868 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.658886 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.658898 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:06Z","lastTransitionTime":"2025-10-06T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.667884 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.678309 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2634dfb4315764b425165e34de939bda7cddade0d5a837a3d88e35d0e13bcce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07a832ca018ee1d21598e72fa0fd375e08549e9298392b50cb87a4cfc20b9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.689781 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.701753 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff96c219-4289-4880-a8ab-ed6da7557dcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec9a2dc2d359f53f2165220af84f6ee28bfef0db39b86b12ebf600727e32e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03b87b0a15c19a70cdb61338f704315d219bd382c5ced04b22851ef71a71464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e270831b0c56a937ba5cd52f2367bce7b5e8a2837c035aa4e112439829cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.715709 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.735747 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:05Z\\\",\\\"message\\\":\\\"4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 06:46:05.017663 6538 services_controller.go:434] Service 
openshift-machine-api/control-plane-machine-set-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{control-plane-machine-set-operator openshift-machine-api ffd0ef27-d28d-43cc-90c8-0e8843e4c04c 4409 0 2025-02-23 05:12:21 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:control-plane-machine-set-operator] map[capability.openshift.io/name:MachineAPI exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true service.alpha.openshift.io/serving-cert-secret-name:control-plane-machine-set-operator-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075e04d7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: control-plane-machine-set-operator,},ClusterIP:10.217.4.41,Type:ClusterIP,ExternalIP\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:46:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb87
5e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.746213 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4l7qj\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.754536 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc 
kubenswrapper[4845]: I1006 06:46:06.760934 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.761033 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.761051 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.761111 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.761127 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:06Z","lastTransitionTime":"2025-10-06T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.765872 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z 
is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.778973 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.790792 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.800328 4845 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.811282 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.820939 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.830793 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.841650 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd70decc01c59307c53f001633b131d20f897ba9825503e5d557fa02d39406f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 
06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:06Z is after 2025-08-24T17:21:41Z"
Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.863594 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.863637 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.863646 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.863663 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.863684 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:06Z","lastTransitionTime":"2025-10-06T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.966493 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.966541 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.966579 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.966596 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:06 crc kubenswrapper[4845]: I1006 06:46:06.966610 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:06Z","lastTransitionTime":"2025-10-06T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.068712 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.068765 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.068783 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.068806 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.068816 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:07Z","lastTransitionTime":"2025-10-06T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.172171 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.172252 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.172267 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.172296 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.172313 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:07Z","lastTransitionTime":"2025-10-06T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.276411 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.276459 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.276472 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.276492 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.276503 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:07Z","lastTransitionTime":"2025-10-06T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.379858 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.379915 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.379934 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.379958 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.379976 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:07Z","lastTransitionTime":"2025-10-06T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.482928 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.482988 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.483007 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.483034 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.483053 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:07Z","lastTransitionTime":"2025-10-06T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.586408 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.586493 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.586513 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.586545 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.586566 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:07Z","lastTransitionTime":"2025-10-06T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.689839 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.689906 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.689923 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.689948 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.689965 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:07Z","lastTransitionTime":"2025-10-06T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.793953 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.794011 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.794029 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.794056 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.794074 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:07Z","lastTransitionTime":"2025-10-06T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.897946 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.897992 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.898003 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.898024 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.898038 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:07Z","lastTransitionTime":"2025-10-06T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:07 crc kubenswrapper[4845]: I1006 06:46:07.981035 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs\") pod \"network-metrics-daemon-4l7qj\" (UID: \"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\") " pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:46:07 crc kubenswrapper[4845]: E1006 06:46:07.981294 4845 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 06 06:46:07 crc kubenswrapper[4845]: E1006 06:46:07.981442 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs podName:f80a2f04-a041-4acb-ace9-c0e40aed5f6d nodeName:}" failed. No retries permitted until 2025-10-06 06:46:23.981414907 +0000 UTC m=+68.496155945 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs") pod "network-metrics-daemon-4l7qj" (UID: "f80a2f04-a041-4acb-ace9-c0e40aed5f6d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.001788 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.002004 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.002031 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.002050 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.002063 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:08Z","lastTransitionTime":"2025-10-06T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.105958 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.106037 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.106058 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.106084 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.106103 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:08Z","lastTransitionTime":"2025-10-06T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.209366 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.209447 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.209464 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.209489 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.209510 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:08Z","lastTransitionTime":"2025-10-06T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.227085 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.227198 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:46:08 crc kubenswrapper[4845]: E1006 06:46:08.227275 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.227309 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.227447 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:46:08 crc kubenswrapper[4845]: E1006 06:46:08.227578 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 06:46:08 crc kubenswrapper[4845]: E1006 06:46:08.227659 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d"
Oct 06 06:46:08 crc kubenswrapper[4845]: E1006 06:46:08.227713 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.285922 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:46:08 crc kubenswrapper[4845]: E1006 06:46:08.286087 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:46:40.28605389 +0000 UTC m=+84.800794928 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.286230 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.286331 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:46:08 crc kubenswrapper[4845]: E1006 06:46:08.286450 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 06 06:46:08 crc kubenswrapper[4845]: E1006 06:46:08.286476 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 06 06:46:08 crc kubenswrapper[4845]: E1006 06:46:08.286490 4845 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 06:46:08 crc kubenswrapper[4845]: E1006 06:46:08.286495 4845 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.286462 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:46:08 crc kubenswrapper[4845]: E1006 06:46:08.286550 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 06:46:40.286534531 +0000 UTC m=+84.801275609 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.286585 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:46:08 crc kubenswrapper[4845]: E1006 06:46:08.286616 4845 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 06 06:46:08 crc kubenswrapper[4845]: E1006 06:46:08.286660 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 06:46:40.286649854 +0000 UTC m=+84.801390942 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 06 06:46:08 crc kubenswrapper[4845]: E1006 06:46:08.286790 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 06 06:46:08 crc kubenswrapper[4845]: E1006 06:46:08.286834 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 06 06:46:08 crc kubenswrapper[4845]: E1006 06:46:08.286847 4845 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 06:46:08 crc kubenswrapper[4845]: E1006 06:46:08.288564 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 06:46:40.286671854 +0000 UTC m=+84.801412942 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 06:46:08 crc kubenswrapper[4845]: E1006 06:46:08.288629 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 06:46:40.288607599 +0000 UTC m=+84.803348667 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.311502 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.311545 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.311557 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.311576 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.311590 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:08Z","lastTransitionTime":"2025-10-06T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.415777 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.415833 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.415850 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.415875 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.415893 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:08Z","lastTransitionTime":"2025-10-06T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.518426 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.518463 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.518473 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.518488 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.518497 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:08Z","lastTransitionTime":"2025-10-06T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.619910 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.619942 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.619951 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.619962 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.619971 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:08Z","lastTransitionTime":"2025-10-06T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.722276 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.722315 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.722323 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.722337 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.722346 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:08Z","lastTransitionTime":"2025-10-06T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.824641 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.824785 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.824870 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.824948 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.825010 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:08Z","lastTransitionTime":"2025-10-06T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.927131 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.927365 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.927464 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.927603 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:08 crc kubenswrapper[4845]: I1006 06:46:08.927671 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:08Z","lastTransitionTime":"2025-10-06T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.029567 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.029599 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.029608 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.029622 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.029631 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:09Z","lastTransitionTime":"2025-10-06T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.131967 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.132035 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.132052 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.132078 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.132095 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:09Z","lastTransitionTime":"2025-10-06T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.234877 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.234935 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.234945 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.234964 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.234975 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:09Z","lastTransitionTime":"2025-10-06T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.337657 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.337696 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.337706 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.337721 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.337746 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:09Z","lastTransitionTime":"2025-10-06T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.440147 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.440182 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.440194 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.440210 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.440220 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:09Z","lastTransitionTime":"2025-10-06T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.542091 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.542154 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.542167 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.542181 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.542189 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:09Z","lastTransitionTime":"2025-10-06T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.644199 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.644232 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.644243 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.644257 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.644268 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:09Z","lastTransitionTime":"2025-10-06T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.746421 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.746479 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.746491 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.746506 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.746520 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:09Z","lastTransitionTime":"2025-10-06T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.848900 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.848934 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.848963 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.848978 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.848988 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:09Z","lastTransitionTime":"2025-10-06T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.951736 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.951799 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.951823 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.951855 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:09 crc kubenswrapper[4845]: I1006 06:46:09.951877 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:09Z","lastTransitionTime":"2025-10-06T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.054473 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.054520 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.054551 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.054570 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.054582 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:10Z","lastTransitionTime":"2025-10-06T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.156702 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.156734 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.156744 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.156757 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.156766 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:10Z","lastTransitionTime":"2025-10-06T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.226002 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.226067 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.226071 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:10 crc kubenswrapper[4845]: E1006 06:46:10.226172 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.226222 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:10 crc kubenswrapper[4845]: E1006 06:46:10.226346 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:10 crc kubenswrapper[4845]: E1006 06:46:10.226475 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:46:10 crc kubenswrapper[4845]: E1006 06:46:10.226585 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.259128 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.259163 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.259172 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.259183 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.259195 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:10Z","lastTransitionTime":"2025-10-06T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.361281 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.361324 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.361334 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.361350 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.361361 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:10Z","lastTransitionTime":"2025-10-06T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.463956 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.464001 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.464012 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.464029 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.464041 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:10Z","lastTransitionTime":"2025-10-06T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.567423 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.567465 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.567473 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.567489 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.567501 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:10Z","lastTransitionTime":"2025-10-06T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.670260 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.670316 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.670332 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.670357 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.670399 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:10Z","lastTransitionTime":"2025-10-06T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.773460 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.773497 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.773507 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.773521 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.773531 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:10Z","lastTransitionTime":"2025-10-06T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.876064 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.876134 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.876157 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.876188 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.876213 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:10Z","lastTransitionTime":"2025-10-06T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.978286 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.978333 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.978345 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.978363 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:10 crc kubenswrapper[4845]: I1006 06:46:10.978399 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:10Z","lastTransitionTime":"2025-10-06T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.081202 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.081241 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.081250 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.081267 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.081280 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:11Z","lastTransitionTime":"2025-10-06T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.184178 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.184224 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.184235 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.184250 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.184260 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:11Z","lastTransitionTime":"2025-10-06T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.286528 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.286578 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.286591 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.286611 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.286627 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:11Z","lastTransitionTime":"2025-10-06T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.389018 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.389082 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.389105 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.389137 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.389163 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:11Z","lastTransitionTime":"2025-10-06T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.491156 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.491202 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.491211 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.491226 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.491236 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:11Z","lastTransitionTime":"2025-10-06T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.593571 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.593613 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.593621 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.593637 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.593649 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:11Z","lastTransitionTime":"2025-10-06T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.696599 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.696638 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.696652 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.696669 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.696682 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:11Z","lastTransitionTime":"2025-10-06T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.799919 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.799964 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.799974 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.799993 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.800005 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:11Z","lastTransitionTime":"2025-10-06T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.902953 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.902997 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.903012 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.903028 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:11 crc kubenswrapper[4845]: I1006 06:46:11.903041 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:11Z","lastTransitionTime":"2025-10-06T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.005358 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.005512 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.005532 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.005557 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.005577 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:12Z","lastTransitionTime":"2025-10-06T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.108208 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.108283 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.108291 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.108307 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.108318 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:12Z","lastTransitionTime":"2025-10-06T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.211233 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.211274 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.211291 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.211309 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.211321 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:12Z","lastTransitionTime":"2025-10-06T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.226729 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.226797 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.226734 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:46:12 crc kubenswrapper[4845]: E1006 06:46:12.226872 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.226897 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:46:12 crc kubenswrapper[4845]: E1006 06:46:12.227077 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:46:12 crc kubenswrapper[4845]: E1006 06:46:12.227147 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d"
Oct 06 06:46:12 crc kubenswrapper[4845]: E1006 06:46:12.227247 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.313644 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.313685 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.313698 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.313714 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.313725 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:12Z","lastTransitionTime":"2025-10-06T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.416996 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.417047 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.417060 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.417079 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.417095 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:12Z","lastTransitionTime":"2025-10-06T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.519207 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.519281 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.519299 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.519322 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.519338 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:12Z","lastTransitionTime":"2025-10-06T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.621168 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.621244 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.621260 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.621280 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.621307 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:12Z","lastTransitionTime":"2025-10-06T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.724548 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.724593 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.724604 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.724621 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.724642 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:12Z","lastTransitionTime":"2025-10-06T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.827277 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.827306 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.827344 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.827366 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.827395 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:12Z","lastTransitionTime":"2025-10-06T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.930142 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.930205 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.930223 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.930247 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:12 crc kubenswrapper[4845]: I1006 06:46:12.930265 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:12Z","lastTransitionTime":"2025-10-06T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.032315 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.032349 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.032358 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.032390 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.032399 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:13Z","lastTransitionTime":"2025-10-06T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.134839 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.134868 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.134875 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.134888 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.134897 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:13Z","lastTransitionTime":"2025-10-06T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.238696 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.238750 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.238767 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.238789 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.238806 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:13Z","lastTransitionTime":"2025-10-06T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.341537 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.341599 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.341616 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.341640 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.341659 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:13Z","lastTransitionTime":"2025-10-06T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.443700 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.443747 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.443755 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.443770 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.443780 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:13Z","lastTransitionTime":"2025-10-06T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.546100 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.546146 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.546157 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.546172 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.546185 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:13Z","lastTransitionTime":"2025-10-06T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.647868 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.647932 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.647953 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.647978 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.647999 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:13Z","lastTransitionTime":"2025-10-06T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.750911 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.750967 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.750977 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.750995 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.751008 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:13Z","lastTransitionTime":"2025-10-06T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.861812 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.861854 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.861866 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.861885 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.861897 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:13Z","lastTransitionTime":"2025-10-06T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:13 crc kubenswrapper[4845]: E1006 06:46:13.874059 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:13Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.877769 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.877826 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.877844 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.877869 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.877888 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:13Z","lastTransitionTime":"2025-10-06T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:13 crc kubenswrapper[4845]: E1006 06:46:13.890579 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:13Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.894758 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.894786 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.894797 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.894812 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.894823 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:13Z","lastTransitionTime":"2025-10-06T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:13 crc kubenswrapper[4845]: E1006 06:46:13.912732 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:13Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.916857 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.916965 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.917011 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.917037 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.917054 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:13Z","lastTransitionTime":"2025-10-06T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:13 crc kubenswrapper[4845]: E1006 06:46:13.929937 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:13Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.933881 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.933926 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.933938 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.933954 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.933965 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:13Z","lastTransitionTime":"2025-10-06T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:13 crc kubenswrapper[4845]: E1006 06:46:13.945746 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:13Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:13 crc kubenswrapper[4845]: E1006 06:46:13.945899 4845 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.947589 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.947628 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.947644 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.947667 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:13 crc kubenswrapper[4845]: I1006 06:46:13.947687 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:13Z","lastTransitionTime":"2025-10-06T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.053490 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.053551 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.053569 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.053594 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.053613 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:14Z","lastTransitionTime":"2025-10-06T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.156745 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.156807 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.156827 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.156853 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.156871 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:14Z","lastTransitionTime":"2025-10-06T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.225787 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.225821 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.225880 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.226208 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:14 crc kubenswrapper[4845]: E1006 06:46:14.226414 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:14 crc kubenswrapper[4845]: E1006 06:46:14.226500 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:14 crc kubenswrapper[4845]: E1006 06:46:14.226633 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:46:14 crc kubenswrapper[4845]: E1006 06:46:14.226828 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.258958 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.259015 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.259033 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.259057 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.259077 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:14Z","lastTransitionTime":"2025-10-06T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.362574 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.362619 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.362633 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.362650 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.362662 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:14Z","lastTransitionTime":"2025-10-06T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.467439 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.467506 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.467528 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.467556 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.467575 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:14Z","lastTransitionTime":"2025-10-06T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.571469 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.571540 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.571592 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.571630 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.571656 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:14Z","lastTransitionTime":"2025-10-06T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.674615 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.674677 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.674696 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.674721 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.674739 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:14Z","lastTransitionTime":"2025-10-06T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.777415 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.777474 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.777494 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.777519 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.777536 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:14Z","lastTransitionTime":"2025-10-06T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.879766 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.879863 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.879882 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.879904 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.879918 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:14Z","lastTransitionTime":"2025-10-06T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.952404 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.964116 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:14Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.980509 4845 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:14Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.981863 4845 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.981909 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.981922 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.981943 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.981955 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:14Z","lastTransitionTime":"2025-10-06T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:14 crc kubenswrapper[4845]: I1006 06:46:14.991894 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:14Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.006628 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd70decc01c59307c53f001633b131d20f897ba9825503e5d557fa02d39406f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 
06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:15Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.021301 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:15Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.035117 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:15Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.052439 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:15Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.066764 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:15Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.080643 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:15Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.084619 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.084664 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.084684 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:15 crc 
kubenswrapper[4845]: I1006 06:46:15.084705 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.084718 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:15Z","lastTransitionTime":"2025-10-06T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.093616 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:15Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.104608 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:15Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.117487 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f5
1bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:15Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.128504 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2634dfb4315764b425165e34de939bda7cddade0d5a837a3d88e35d0e13bcce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07a832ca018ee1d21598e72fa0fd375e0854
9e9298392b50cb87a4cfc20b9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:15Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.138727 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff96c219-4289-4880-a8ab-ed6da7557dcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec9a2dc2d359f53f2165220af84f6ee28bfef0db39b86b12ebf600727e32e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03b87b0a15c19a70cdb61338f704315d219bd382c5ced04b22851ef71a71464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e270831b0c56a937ba5cd52f2367bce7b5e8a2837c035aa4e112439829cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:15Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.154482 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:15Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.172280 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:05Z\\\",\\\"message\\\":\\\"4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 06:46:05.017663 6538 services_controller.go:434] Service 
openshift-machine-api/control-plane-machine-set-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{control-plane-machine-set-operator openshift-machine-api ffd0ef27-d28d-43cc-90c8-0e8843e4c04c 4409 0 2025-02-23 05:12:21 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:control-plane-machine-set-operator] map[capability.openshift.io/name:MachineAPI exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true service.alpha.openshift.io/serving-cert-secret-name:control-plane-machine-set-operator-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075e04d7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: control-plane-machine-set-operator,},ClusterIP:10.217.4.41,Type:ClusterIP,ExternalIP\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:46:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb87
5e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:15Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.182243 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4l7qj\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:15Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.187018 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.187054 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.187064 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.187080 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.187090 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:15Z","lastTransitionTime":"2025-10-06T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.289190 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.289246 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.289254 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.289266 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.289296 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:15Z","lastTransitionTime":"2025-10-06T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.392072 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.392148 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.392174 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.392194 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.392207 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:15Z","lastTransitionTime":"2025-10-06T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.494563 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.494629 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.494647 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.494673 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.494691 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:15Z","lastTransitionTime":"2025-10-06T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.597270 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.597326 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.597344 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.597370 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.597438 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:15Z","lastTransitionTime":"2025-10-06T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.701455 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.701543 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.701564 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.701600 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.701620 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:15Z","lastTransitionTime":"2025-10-06T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.804458 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.804525 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.804544 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.804569 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.804587 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:15Z","lastTransitionTime":"2025-10-06T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.908664 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.908719 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.908732 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.908753 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:15 crc kubenswrapper[4845]: I1006 06:46:15.908767 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:15Z","lastTransitionTime":"2025-10-06T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.011130 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.011184 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.011201 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.011228 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.011242 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:16Z","lastTransitionTime":"2025-10-06T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.114534 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.114588 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.114604 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.114632 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.114650 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:16Z","lastTransitionTime":"2025-10-06T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.216984 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.217073 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.217088 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.217140 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.217161 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:16Z","lastTransitionTime":"2025-10-06T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.227653 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.227703 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.227718 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.227653 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:16 crc kubenswrapper[4845]: E1006 06:46:16.227798 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:16 crc kubenswrapper[4845]: E1006 06:46:16.227934 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:46:16 crc kubenswrapper[4845]: E1006 06:46:16.228006 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:16 crc kubenswrapper[4845]: E1006 06:46:16.228082 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.238838 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:16Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.249165 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:16Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.266758 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f5
1bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:16Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.276607 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2634dfb4315764b425165e34de939bda7cddade0d5a837a3d88e35d0e13bcce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07a832ca018ee1d21598e72fa0fd375e0854
9e9298392b50cb87a4cfc20b9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:16Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.286063 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff96c219-4289-4880-a8ab-ed6da7557dcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec9a2dc2d359f53f2165220af84f6ee28bfef0db39b86b12ebf600727e32e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03b87b0a15c19a70cdb61338f704315d219bd382c5ced04b22851ef71a71464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e270831b0c56a937ba5cd52f2367bce7b5e8a2837c035aa4e112439829cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:16Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.298096 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:16Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.317345 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:05Z\\\",\\\"message\\\":\\\"4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 06:46:05.017663 6538 services_controller.go:434] Service 
openshift-machine-api/control-plane-machine-set-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{control-plane-machine-set-operator openshift-machine-api ffd0ef27-d28d-43cc-90c8-0e8843e4c04c 4409 0 2025-02-23 05:12:21 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:control-plane-machine-set-operator] map[capability.openshift.io/name:MachineAPI exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true service.alpha.openshift.io/serving-cert-secret-name:control-plane-machine-set-operator-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075e04d7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: control-plane-machine-set-operator,},ClusterIP:10.217.4.41,Type:ClusterIP,ExternalIP\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:46:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb87
5e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:16Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.318955 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.318985 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.318995 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.319012 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.319025 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:16Z","lastTransitionTime":"2025-10-06T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.333267 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4l7qj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:16Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:16 crc 
kubenswrapper[4845]: I1006 06:46:16.344717 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:16Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.362152 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:16Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.371547 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:16Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.383020 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"n
ame\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd70decc01c59307c53f001633b131d20f897ba9825503e5d557fa02d39406f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:16Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.392948 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:16Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.403580 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:46:16Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.416914 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c
6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:16Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.421773 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.421826 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.421838 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.421859 4845 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.421870 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:16Z","lastTransitionTime":"2025-10-06T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.431596 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:16Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.442550 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:16Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.523773 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.523846 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.523862 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:16 crc 
kubenswrapper[4845]: I1006 06:46:16.523889 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.523907 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:16Z","lastTransitionTime":"2025-10-06T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.627240 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.627306 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.627325 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.627353 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.627399 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:16Z","lastTransitionTime":"2025-10-06T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.729916 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.729975 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.729994 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.730019 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.730037 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:16Z","lastTransitionTime":"2025-10-06T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.833421 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.833478 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.833496 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.833519 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.833532 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:16Z","lastTransitionTime":"2025-10-06T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.936480 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.936743 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.936753 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.936767 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:16 crc kubenswrapper[4845]: I1006 06:46:16.936777 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:16Z","lastTransitionTime":"2025-10-06T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.039453 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.039504 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.039516 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.039589 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.039605 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:17Z","lastTransitionTime":"2025-10-06T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.142073 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.142113 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.142126 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.142145 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.142158 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:17Z","lastTransitionTime":"2025-10-06T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.246890 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.246992 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.247012 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.247041 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.247057 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:17Z","lastTransitionTime":"2025-10-06T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.349821 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.349869 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.349879 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.349896 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.349906 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:17Z","lastTransitionTime":"2025-10-06T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.452732 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.452803 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.452824 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.452862 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.452882 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:17Z","lastTransitionTime":"2025-10-06T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.556636 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.556735 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.556757 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.556801 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.556828 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:17Z","lastTransitionTime":"2025-10-06T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.660309 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.660390 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.660404 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.660426 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.660439 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:17Z","lastTransitionTime":"2025-10-06T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.763439 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.763481 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.763490 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.763508 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.763516 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:17Z","lastTransitionTime":"2025-10-06T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.866269 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.866313 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.866325 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.866342 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.866354 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:17Z","lastTransitionTime":"2025-10-06T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.968787 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.968831 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.968841 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.968856 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:17 crc kubenswrapper[4845]: I1006 06:46:17.968866 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:17Z","lastTransitionTime":"2025-10-06T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.071650 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.071690 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.071699 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.071715 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.071750 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:18Z","lastTransitionTime":"2025-10-06T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.174583 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.174629 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.174641 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.174658 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.174670 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:18Z","lastTransitionTime":"2025-10-06T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.226245 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.226245 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.226354 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.226482 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:18 crc kubenswrapper[4845]: E1006 06:46:18.226779 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:18 crc kubenswrapper[4845]: E1006 06:46:18.226885 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:46:18 crc kubenswrapper[4845]: E1006 06:46:18.226979 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:18 crc kubenswrapper[4845]: E1006 06:46:18.226659 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.277804 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.277873 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.277895 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.277926 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.277949 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:18Z","lastTransitionTime":"2025-10-06T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.380967 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.381001 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.381011 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.381024 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.381033 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:18Z","lastTransitionTime":"2025-10-06T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.483744 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.483826 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.483840 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.483888 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.483904 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:18Z","lastTransitionTime":"2025-10-06T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.587189 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.587244 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.587253 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.587271 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.587283 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:18Z","lastTransitionTime":"2025-10-06T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.690449 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.690526 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.690556 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.690586 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.690606 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:18Z","lastTransitionTime":"2025-10-06T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.793085 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.793138 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.793155 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.793179 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.793195 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:18Z","lastTransitionTime":"2025-10-06T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.895865 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.895907 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.895919 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.895937 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.895949 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:18Z","lastTransitionTime":"2025-10-06T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.998732 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.998768 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.998780 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.998797 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:18 crc kubenswrapper[4845]: I1006 06:46:18.998809 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:18Z","lastTransitionTime":"2025-10-06T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.100656 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.100692 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.100703 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.100721 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.100733 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:19Z","lastTransitionTime":"2025-10-06T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.203656 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.203694 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.203704 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.203719 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.203730 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:19Z","lastTransitionTime":"2025-10-06T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.227122 4845 scope.go:117] "RemoveContainer" containerID="b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c" Oct 06 06:46:19 crc kubenswrapper[4845]: E1006 06:46:19.227386 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.305406 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.305463 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.305479 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.305525 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.305539 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:19Z","lastTransitionTime":"2025-10-06T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.408341 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.408397 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.408407 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.408421 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.408431 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:19Z","lastTransitionTime":"2025-10-06T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.510803 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.510851 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.510869 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.510890 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.510907 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:19Z","lastTransitionTime":"2025-10-06T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.612342 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.612406 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.612421 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.612439 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.612451 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:19Z","lastTransitionTime":"2025-10-06T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.715177 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.715210 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.715219 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.715232 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.715241 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:19Z","lastTransitionTime":"2025-10-06T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.817778 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.817825 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.817838 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.817857 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.817871 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:19Z","lastTransitionTime":"2025-10-06T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.920994 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.921071 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.921100 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.921134 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:19 crc kubenswrapper[4845]: I1006 06:46:19.921152 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:19Z","lastTransitionTime":"2025-10-06T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.024573 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.024640 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.024663 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.024690 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.024710 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:20Z","lastTransitionTime":"2025-10-06T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.128032 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.128073 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.128085 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.128107 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.128118 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:20Z","lastTransitionTime":"2025-10-06T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.226309 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:20 crc kubenswrapper[4845]: E1006 06:46:20.226453 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.226331 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:20 crc kubenswrapper[4845]: E1006 06:46:20.226522 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.226309 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.226327 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:20 crc kubenswrapper[4845]: E1006 06:46:20.226767 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:20 crc kubenswrapper[4845]: E1006 06:46:20.226896 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.231097 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.231142 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.231158 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.231186 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.231204 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:20Z","lastTransitionTime":"2025-10-06T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.334470 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.334559 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.334589 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.334628 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.334652 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:20Z","lastTransitionTime":"2025-10-06T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.439308 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.439356 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.439367 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.439404 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.439414 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:20Z","lastTransitionTime":"2025-10-06T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.542062 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.542098 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.542107 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.542122 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.542131 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:20Z","lastTransitionTime":"2025-10-06T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.644290 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.644330 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.644340 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.644355 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.644366 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:20Z","lastTransitionTime":"2025-10-06T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.747088 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.747133 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.747145 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.747162 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.747176 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:20Z","lastTransitionTime":"2025-10-06T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.850117 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.850174 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.850184 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.850199 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.850209 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:20Z","lastTransitionTime":"2025-10-06T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.952865 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.952925 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.952944 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.952972 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:20 crc kubenswrapper[4845]: I1006 06:46:20.952992 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:20Z","lastTransitionTime":"2025-10-06T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.055507 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.055543 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.055552 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.055568 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.055580 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:21Z","lastTransitionTime":"2025-10-06T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.158701 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.158741 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.158750 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.158766 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.158775 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:21Z","lastTransitionTime":"2025-10-06T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.261177 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.261212 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.261221 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.261234 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.261245 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:21Z","lastTransitionTime":"2025-10-06T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.364121 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.364174 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.364186 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.364205 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.364218 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:21Z","lastTransitionTime":"2025-10-06T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.467148 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.467186 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.467196 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.467212 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.467226 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:21Z","lastTransitionTime":"2025-10-06T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.569757 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.569795 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.569804 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.569819 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.569828 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:21Z","lastTransitionTime":"2025-10-06T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.672040 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.672086 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.672098 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.672115 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.672126 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:21Z","lastTransitionTime":"2025-10-06T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.774311 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.774399 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.774418 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.774438 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.774456 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:21Z","lastTransitionTime":"2025-10-06T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.876990 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.877083 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.877098 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.877115 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.877128 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:21Z","lastTransitionTime":"2025-10-06T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.979848 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.979912 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.979935 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.979965 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:21 crc kubenswrapper[4845]: I1006 06:46:21.979984 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:21Z","lastTransitionTime":"2025-10-06T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.082605 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.082900 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.083000 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.083104 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.083196 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:22Z","lastTransitionTime":"2025-10-06T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.185601 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.185637 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.185648 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.185665 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.185679 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:22Z","lastTransitionTime":"2025-10-06T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.226615 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.226676 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.226707 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.226624 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:22 crc kubenswrapper[4845]: E1006 06:46:22.226755 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:46:22 crc kubenswrapper[4845]: E1006 06:46:22.226832 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:46:22 crc kubenswrapper[4845]: E1006 06:46:22.226929 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:22 crc kubenswrapper[4845]: E1006 06:46:22.227044 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.288042 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.288107 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.288122 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.288139 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.288150 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:22Z","lastTransitionTime":"2025-10-06T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.390833 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.390872 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.390881 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.390898 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.390911 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:22Z","lastTransitionTime":"2025-10-06T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.493020 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.493060 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.493073 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.493089 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.493103 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:22Z","lastTransitionTime":"2025-10-06T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.595253 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.595299 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.595309 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.595326 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.595337 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:22Z","lastTransitionTime":"2025-10-06T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.697465 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.697504 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.697515 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.697532 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.697543 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:22Z","lastTransitionTime":"2025-10-06T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.799911 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.799960 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.799974 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.799993 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.800007 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:22Z","lastTransitionTime":"2025-10-06T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.902655 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.902686 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.902694 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.902707 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:22 crc kubenswrapper[4845]: I1006 06:46:22.902718 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:22Z","lastTransitionTime":"2025-10-06T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.004053 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.004092 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.004105 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.004123 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.004134 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:23Z","lastTransitionTime":"2025-10-06T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.107006 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.107063 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.107081 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.107105 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.107124 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:23Z","lastTransitionTime":"2025-10-06T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.209253 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.209311 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.209338 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.209414 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.209443 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:23Z","lastTransitionTime":"2025-10-06T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.311687 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.311762 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.311784 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.311809 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.311827 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:23Z","lastTransitionTime":"2025-10-06T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.414133 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.414171 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.414182 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.414197 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.414212 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:23Z","lastTransitionTime":"2025-10-06T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.516184 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.516236 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.516252 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.516274 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.516293 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:23Z","lastTransitionTime":"2025-10-06T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.618052 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.618104 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.618114 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.618131 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.618142 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:23Z","lastTransitionTime":"2025-10-06T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.720745 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.720786 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.720795 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.720813 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.720825 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:23Z","lastTransitionTime":"2025-10-06T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.823874 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.823927 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.823943 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.823967 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.823985 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:23Z","lastTransitionTime":"2025-10-06T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.926275 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.926329 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.926348 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.926416 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:23 crc kubenswrapper[4845]: I1006 06:46:23.926454 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:23Z","lastTransitionTime":"2025-10-06T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.029193 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.029413 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.029444 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.029471 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.029493 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:24Z","lastTransitionTime":"2025-10-06T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.074928 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs\") pod \"network-metrics-daemon-4l7qj\" (UID: \"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\") " pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:24 crc kubenswrapper[4845]: E1006 06:46:24.075067 4845 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 06:46:24 crc kubenswrapper[4845]: E1006 06:46:24.075123 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs podName:f80a2f04-a041-4acb-ace9-c0e40aed5f6d nodeName:}" failed. No retries permitted until 2025-10-06 06:46:56.075108466 +0000 UTC m=+100.589849474 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs") pod "network-metrics-daemon-4l7qj" (UID: "f80a2f04-a041-4acb-ace9-c0e40aed5f6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.131350 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.131414 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.131428 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.131447 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.131460 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:24Z","lastTransitionTime":"2025-10-06T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.226320 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.226448 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:24 crc kubenswrapper[4845]: E1006 06:46:24.226511 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:24 crc kubenswrapper[4845]: E1006 06:46:24.226635 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.226745 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:24 crc kubenswrapper[4845]: E1006 06:46:24.226844 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.226902 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:24 crc kubenswrapper[4845]: E1006 06:46:24.226997 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.233079 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.233189 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.233257 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.233318 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.233396 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:24Z","lastTransitionTime":"2025-10-06T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.248121 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.248219 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.248289 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.248349 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.248605 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:24Z","lastTransitionTime":"2025-10-06T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:24 crc kubenswrapper[4845]: E1006 06:46:24.263853 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:24Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.268296 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.268327 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.268335 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.268350 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.268361 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:24Z","lastTransitionTime":"2025-10-06T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:24 crc kubenswrapper[4845]: E1006 06:46:24.288753 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:24Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.292498 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.292575 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.292599 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.292630 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.292654 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:24Z","lastTransitionTime":"2025-10-06T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:24 crc kubenswrapper[4845]: E1006 06:46:24.310911 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:24Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.315488 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.315555 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.315574 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.315599 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.315620 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:24Z","lastTransitionTime":"2025-10-06T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:24 crc kubenswrapper[4845]: E1006 06:46:24.335854 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:24Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.340101 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.340147 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.340161 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.340179 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.340191 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:24Z","lastTransitionTime":"2025-10-06T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:24 crc kubenswrapper[4845]: E1006 06:46:24.356326 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:24Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:24 crc kubenswrapper[4845]: E1006 06:46:24.356667 4845 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.358017 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.358052 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.358061 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.358076 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.358086 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:24Z","lastTransitionTime":"2025-10-06T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.460172 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.460206 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.460217 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.460236 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.460248 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:24Z","lastTransitionTime":"2025-10-06T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.562726 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.562759 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.562767 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.562781 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.562789 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:24Z","lastTransitionTime":"2025-10-06T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.664466 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.664566 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.664591 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.664622 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.664642 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:24Z","lastTransitionTime":"2025-10-06T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.768106 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.768302 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.768414 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.768520 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.768623 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:24Z","lastTransitionTime":"2025-10-06T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.871119 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.871154 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.871165 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.871180 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.871190 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:24Z","lastTransitionTime":"2025-10-06T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.973402 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.973436 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.973448 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.973463 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:24 crc kubenswrapper[4845]: I1006 06:46:24.973475 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:24Z","lastTransitionTime":"2025-10-06T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.077643 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.077675 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.077684 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.077699 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.077710 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:25Z","lastTransitionTime":"2025-10-06T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.180020 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.180050 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.180066 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.180082 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.180092 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:25Z","lastTransitionTime":"2025-10-06T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.283132 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.283172 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.283180 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.283197 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.283206 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:25Z","lastTransitionTime":"2025-10-06T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.385307 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.385345 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.385353 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.385368 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.385398 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:25Z","lastTransitionTime":"2025-10-06T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.487185 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.487215 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.487224 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.487237 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.487246 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:25Z","lastTransitionTime":"2025-10-06T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.588700 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.588734 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.588745 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.588759 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.588772 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:25Z","lastTransitionTime":"2025-10-06T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.665527 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zpn9l_2080026c-9eee-4863-b62d-e9ce4d4525dd/kube-multus/0.log" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.665579 4845 generic.go:334] "Generic (PLEG): container finished" podID="2080026c-9eee-4863-b62d-e9ce4d4525dd" containerID="8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992" exitCode=1 Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.665618 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zpn9l" event={"ID":"2080026c-9eee-4863-b62d-e9ce4d4525dd","Type":"ContainerDied","Data":"8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992"} Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.666553 4845 scope.go:117] "RemoveContainer" containerID="8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.681942 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd70decc01c59307c53f001633b131d20f897ba9825503e5d557fa02d39406f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e91
0c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:25Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.696901 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.697047 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.697129 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.697220 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.697300 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:25Z","lastTransitionTime":"2025-10-06T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.698238 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16
e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:25Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.713354 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:46:25Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.727822 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c
6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:25Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.745789 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:25Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.759199 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:25Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.772001 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:25Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.782479 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:25Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.796577 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f5
1bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:25Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.800656 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.800691 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.800703 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.800725 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.800735 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:25Z","lastTransitionTime":"2025-10-06T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.807084 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2634dfb4315764b425165e34de939bda7cddade0d5a837a3d88e35d0e13bcce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metri
cs-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07a832ca018ee1d21598e72fa0fd375e08549e9298392b50cb87a4cfc20b9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:25Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.818547 4845 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff96c219-4289-4880-a8ab-ed6da7557dcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec9a2dc2d359f53f2165220af84f6ee28bfef0db39b86b12ebf600727e32e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03b87b0a15c19a70cdb61338f704315d219bd382c5ced04b22851ef71a71464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e270831b0c56a937ba5cd52f2367bce7b5e8a2837c035aa4e112439829cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:25Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.831286 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:25Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.869784 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:05Z\\\",\\\"message\\\":\\\"4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 06:46:05.017663 6538 services_controller.go:434] Service 
openshift-machine-api/control-plane-machine-set-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{control-plane-machine-set-operator openshift-machine-api ffd0ef27-d28d-43cc-90c8-0e8843e4c04c 4409 0 2025-02-23 05:12:21 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:control-plane-machine-set-operator] map[capability.openshift.io/name:MachineAPI exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true service.alpha.openshift.io/serving-cert-secret-name:control-plane-machine-set-operator-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075e04d7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: control-plane-machine-set-operator,},ClusterIP:10.217.4.41,Type:ClusterIP,ExternalIP\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:46:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb87
5e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:25Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.885827 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4l7qj\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:25Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.900242 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:25Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:25 crc 
kubenswrapper[4845]: I1006 06:46:25.902687 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.902742 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.902756 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.902778 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.902796 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:25Z","lastTransitionTime":"2025-10-06T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.911878 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:25Z\\\",\\\"message\\\":\\\"2025-10-06T06:45:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7c6aa4b7-a6f3-4775-936a-c7499910a738\\\\n2025-10-06T06:45:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7c6aa4b7-a6f3-4775-936a-c7499910a738 to /host/opt/cni/bin/\\\\n2025-10-06T06:45:40Z [verbose] multus-daemon started\\\\n2025-10-06T06:45:40Z [verbose] Readiness Indicator file check\\\\n2025-10-06T06:46:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:25Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:25 crc kubenswrapper[4845]: I1006 06:46:25.920363 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:25Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.004545 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.004574 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.004583 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.004596 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.004605 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:26Z","lastTransitionTime":"2025-10-06T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.106321 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.106356 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.106365 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.106392 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.106402 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:26Z","lastTransitionTime":"2025-10-06T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.208135 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.208195 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.208215 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.208239 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.208256 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:26Z","lastTransitionTime":"2025-10-06T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.226520 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.226629 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.226601 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.226558 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:26 crc kubenswrapper[4845]: E1006 06:46:26.226836 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:26 crc kubenswrapper[4845]: E1006 06:46:26.227011 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:46:26 crc kubenswrapper[4845]: E1006 06:46:26.227136 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:26 crc kubenswrapper[4845]: E1006 06:46:26.227237 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.238953 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.250093 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.265142 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd70decc01c59307c53f001633b131d20f897ba9825503e5d557fa02d39406f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e91
0c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.277144 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.287314 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.297023 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c
6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.307539 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.310210 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.310250 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.310298 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.310317 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.310338 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:26Z","lastTransitionTime":"2025-10-06T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.317396 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.334565 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f99
76f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.343930 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2634dfb4315764b425165e34de939bda7cddade0d5a837a3d88e35d0e13bcce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07a832ca018ee1d21598e72fa0fd375e08549e9298392b50cb87a4cfc20b9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.352419 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4l7qj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc 
kubenswrapper[4845]: I1006 06:46:26.366498 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff96c219-4289-4880-a8ab-ed6da7557dcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec9a2dc2d359f53f2165220af84f6ee28bfef0db39b86b12ebf600727e32e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03b87b0a15c19a70cdb61338f704315d219bd382c5ced04b22851ef71a71464f\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e270831b0c56a937ba5cd52f2367bce7b5e8a2837c035aa4e112439829cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.378413 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.402556 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:05Z\\\",\\\"message\\\":\\\"4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 06:46:05.017663 6538 services_controller.go:434] Service 
openshift-machine-api/control-plane-machine-set-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{control-plane-machine-set-operator openshift-machine-api ffd0ef27-d28d-43cc-90c8-0e8843e4c04c 4409 0 2025-02-23 05:12:21 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:control-plane-machine-set-operator] map[capability.openshift.io/name:MachineAPI exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true service.alpha.openshift.io/serving-cert-secret-name:control-plane-machine-set-operator-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075e04d7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: control-plane-machine-set-operator,},ClusterIP:10.217.4.41,Type:ClusterIP,ExternalIP\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:46:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb87
5e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.412659 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2
f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.412843 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.412966 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.412976 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:26 crc 
kubenswrapper[4845]: I1006 06:46:26.412989 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.412998 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:26Z","lastTransitionTime":"2025-10-06T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.423092 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:25Z\\\",\\\"message\\\":\\\"2025-10-06T06:45:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7c6aa4b7-a6f3-4775-936a-c7499910a738\\\\n2025-10-06T06:45:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7c6aa4b7-a6f3-4775-936a-c7499910a738 to /host/opt/cni/bin/\\\\n2025-10-06T06:45:40Z [verbose] multus-daemon started\\\\n2025-10-06T06:45:40Z [verbose] Readiness Indicator file check\\\\n2025-10-06T06:46:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.435349 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.515658 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.515692 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.515703 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.515721 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.515732 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:26Z","lastTransitionTime":"2025-10-06T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.618169 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.618202 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.618210 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.618223 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.618231 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:26Z","lastTransitionTime":"2025-10-06T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.670099 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zpn9l_2080026c-9eee-4863-b62d-e9ce4d4525dd/kube-multus/0.log" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.670158 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zpn9l" event={"ID":"2080026c-9eee-4863-b62d-e9ce4d4525dd","Type":"ContainerStarted","Data":"a5fb957ec713b15c373dd40160d79fa6038407e541c2603db92c1fa4f6e96959"} Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.679902 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4l7qj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc 
kubenswrapper[4845]: I1006 06:46:26.690448 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff96c219-4289-4880-a8ab-ed6da7557dcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec9a2dc2d359f53f2165220af84f6ee28bfef0db39b86b12ebf600727e32e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03b87b0a15c19a70cdb61338f704315d219bd382c5ced04b22851ef71a71464f\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e270831b0c56a937ba5cd52f2367bce7b5e8a2837c035aa4e112439829cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.701309 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.716641 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:05Z\\\",\\\"message\\\":\\\"4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 06:46:05.017663 6538 services_controller.go:434] Service 
openshift-machine-api/control-plane-machine-set-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{control-plane-machine-set-operator openshift-machine-api ffd0ef27-d28d-43cc-90c8-0e8843e4c04c 4409 0 2025-02-23 05:12:21 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:control-plane-machine-set-operator] map[capability.openshift.io/name:MachineAPI exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true service.alpha.openshift.io/serving-cert-secret-name:control-plane-machine-set-operator-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075e04d7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: control-plane-machine-set-operator,},ClusterIP:10.217.4.41,Type:ClusterIP,ExternalIP\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:46:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb87
5e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.719723 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.719748 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.719758 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.719775 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.719785 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:26Z","lastTransitionTime":"2025-10-06T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.726728 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.737151 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5fb957ec713b15c373dd40160d79fa6038407e541c2603db92c1fa4f6e96959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:25Z\\\",\\\"message\\\":\\\"2025-10-06T06:45:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_7c6aa4b7-a6f3-4775-936a-c7499910a738\\\\n2025-10-06T06:45:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7c6aa4b7-a6f3-4775-936a-c7499910a738 to /host/opt/cni/bin/\\\\n2025-10-06T06:45:40Z [verbose] multus-daemon started\\\\n2025-10-06T06:45:40Z [verbose] Readiness Indicator file check\\\\n2025-10-06T06:46:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.745281 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.755675 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.766131 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.777270 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd70decc01c59307c53f001633b131d20f897ba9825503e5d557fa02d39406f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e91
0c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.788257 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.798649 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.808353 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c
6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.817864 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.821887 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.821948 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.821957 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.821971 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.821981 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:26Z","lastTransitionTime":"2025-10-06T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.827858 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.841127 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f99
76f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.850512 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2634dfb4315764b425165e34de939bda7cddade0d5a837a3d88e35d0e13bcce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07a832ca018ee1d21598e72fa0fd375e08549e9298392b50cb87a4cfc20b9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-06T06:46:26Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.924951 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.924985 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.924993 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.925006 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:26 crc kubenswrapper[4845]: I1006 06:46:26.925016 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:26Z","lastTransitionTime":"2025-10-06T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.026820 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.026857 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.026867 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.026883 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.026896 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:27Z","lastTransitionTime":"2025-10-06T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.129305 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.129348 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.129361 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.129395 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.129408 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:27Z","lastTransitionTime":"2025-10-06T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.231352 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.231418 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.231429 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.231447 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.231459 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:27Z","lastTransitionTime":"2025-10-06T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.333509 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.333554 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.333565 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.333582 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.333593 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:27Z","lastTransitionTime":"2025-10-06T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.439147 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.439206 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.439225 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.439249 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.439266 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:27Z","lastTransitionTime":"2025-10-06T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.542211 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.542253 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.542262 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.542277 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.542287 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:27Z","lastTransitionTime":"2025-10-06T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.645160 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.645219 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.645239 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.645266 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.645282 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:27Z","lastTransitionTime":"2025-10-06T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.747975 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.748017 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.748028 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.748049 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.748061 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:27Z","lastTransitionTime":"2025-10-06T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.850466 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.850765 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.850909 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.851041 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.851198 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:27Z","lastTransitionTime":"2025-10-06T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.953814 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.953864 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.953876 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.953894 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:27 crc kubenswrapper[4845]: I1006 06:46:27.953906 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:27Z","lastTransitionTime":"2025-10-06T06:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.056507 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.056546 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.056557 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.056573 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.056583 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:28Z","lastTransitionTime":"2025-10-06T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.159073 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.159133 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.159155 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.159185 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.159208 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:28Z","lastTransitionTime":"2025-10-06T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.226764 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.226813 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.226899 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:28 crc kubenswrapper[4845]: E1006 06:46:28.227005 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.227035 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:28 crc kubenswrapper[4845]: E1006 06:46:28.227084 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:46:28 crc kubenswrapper[4845]: E1006 06:46:28.227175 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:28 crc kubenswrapper[4845]: E1006 06:46:28.227313 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.261603 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.261647 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.261660 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.261674 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.261685 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:28Z","lastTransitionTime":"2025-10-06T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.364589 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.364624 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.364636 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.364654 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.364664 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:28Z","lastTransitionTime":"2025-10-06T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.466793 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.466852 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.466875 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.466910 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.466930 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:28Z","lastTransitionTime":"2025-10-06T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.568974 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.569036 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.569054 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.569111 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.569136 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:28Z","lastTransitionTime":"2025-10-06T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.672770 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.672802 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.672810 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.672825 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.672834 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:28Z","lastTransitionTime":"2025-10-06T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.775098 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.775127 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.775137 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.775150 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.775158 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:28Z","lastTransitionTime":"2025-10-06T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.877291 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.877347 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.877365 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.877423 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.877446 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:28Z","lastTransitionTime":"2025-10-06T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.979630 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.979702 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.979720 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.979746 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:28 crc kubenswrapper[4845]: I1006 06:46:28.979763 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:28Z","lastTransitionTime":"2025-10-06T06:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.082420 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.082460 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.082472 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.082489 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.082501 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:29Z","lastTransitionTime":"2025-10-06T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.184496 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.184551 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.184560 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.184577 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.184588 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:29Z","lastTransitionTime":"2025-10-06T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.286981 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.287034 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.287046 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.287067 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.287082 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:29Z","lastTransitionTime":"2025-10-06T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.389611 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.389695 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.389719 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.389751 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.389773 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:29Z","lastTransitionTime":"2025-10-06T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.492450 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.492499 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.492510 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.492532 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.492544 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:29Z","lastTransitionTime":"2025-10-06T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.595615 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.595666 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.595677 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.595698 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.595710 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:29Z","lastTransitionTime":"2025-10-06T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.697904 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.697965 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.697984 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.698008 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.698028 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:29Z","lastTransitionTime":"2025-10-06T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.801335 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.801434 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.801458 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.801479 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.801499 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:29Z","lastTransitionTime":"2025-10-06T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.903975 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.904051 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.904074 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.904108 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:29 crc kubenswrapper[4845]: I1006 06:46:29.904129 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:29Z","lastTransitionTime":"2025-10-06T06:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.007032 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.007089 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.007106 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.007130 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.007147 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:30Z","lastTransitionTime":"2025-10-06T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.110717 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.110790 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.110813 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.110844 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.110868 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:30Z","lastTransitionTime":"2025-10-06T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.214015 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.214070 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.214091 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.214113 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.214132 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:30Z","lastTransitionTime":"2025-10-06T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.226017 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.226066 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:30 crc kubenswrapper[4845]: E1006 06:46:30.226241 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.226352 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:30 crc kubenswrapper[4845]: E1006 06:46:30.226537 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.226628 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:30 crc kubenswrapper[4845]: E1006 06:46:30.226765 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:30 crc kubenswrapper[4845]: E1006 06:46:30.227009 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.237303 4845 scope.go:117] "RemoveContainer" containerID="b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.317316 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.317401 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.317414 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.317429 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.317441 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:30Z","lastTransitionTime":"2025-10-06T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.420095 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.420213 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.420256 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.420277 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.420291 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:30Z","lastTransitionTime":"2025-10-06T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.524197 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.524250 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.524262 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.524281 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.524298 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:30Z","lastTransitionTime":"2025-10-06T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.626676 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.626703 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.626711 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.626726 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.626735 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:30Z","lastTransitionTime":"2025-10-06T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.683071 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovnkube-controller/2.log" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.685096 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerStarted","Data":"87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce"} Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.686064 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.698960 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:30Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.711569 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:30Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.728786 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.728823 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.728833 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.728860 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.728869 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:30Z","lastTransitionTime":"2025-10-06T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.732570 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:30Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.745106 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2634dfb4315764b425165e34de939bda7cddade0d5a837a3d88e35d0e13bcce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07a832ca018ee1d21598e72fa0fd375e08549e9298392b50cb87a4cfc20b9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:30Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.767257 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff96c219-4289-4880-a8ab-ed6da7557dcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec9a2dc2d359f53f2165220af84f6ee28bfef0db39b86b12ebf600727e32e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03b87b0a15c19a70cdb61338f704315d219bd382c5ced04b22851ef71a71464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e270831b0c56a937ba5cd52f2367bce7b5e8a2837c035aa4e112439829cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://949700a9
b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:30Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.779149 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:30Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.798115 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:05Z\\\",\\\"message\\\":\\\"4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 06:46:05.017663 6538 services_controller.go:434] Service 
openshift-machine-api/control-plane-machine-set-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{control-plane-machine-set-operator openshift-machine-api ffd0ef27-d28d-43cc-90c8-0e8843e4c04c 4409 0 2025-02-23 05:12:21 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:control-plane-machine-set-operator] map[capability.openshift.io/name:MachineAPI exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true service.alpha.openshift.io/serving-cert-secret-name:control-plane-machine-set-operator-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075e04d7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: 
control-plane-machine-set-operator,},ClusterIP:10.217.4.41,Type:ClusterIP,ExternalIP\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:46:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"va
r-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154ed
c32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:30Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.808100 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4l7qj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:30Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:30 crc 
kubenswrapper[4845]: I1006 06:46:30.817459 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:30Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.829815 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5fb957ec713b15c373dd40160d79fa6038407e541c2603db92c1fa4f6e96959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:25Z\\\",\\\"message\\\":\\\"2025-10-06T06:45:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7c6aa4b7-a6f3-4775-936a-c7499910a738\\\\n2025-10-06T06:45:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7c6aa4b7-a6f3-4775-936a-c7499910a738 to /host/opt/cni/bin/\\\\n2025-10-06T06:45:40Z [verbose] multus-daemon started\\\\n2025-10-06T06:45:40Z [verbose] 
Readiness Indicator file check\\\\n2025-10-06T06:46:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:30Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.831159 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.831183 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.831192 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.831207 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.831217 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:30Z","lastTransitionTime":"2025-10-06T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.840501 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:30Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.852270 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd70decc01c59307c53f001633b131d20f897ba9825503e5d557fa02d39406f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 
06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:30Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.865189 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:30Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.875740 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:30Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.888186 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:30Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.900835 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:30Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.924229 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:30Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.932919 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.932942 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.932952 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:30 crc 
kubenswrapper[4845]: I1006 06:46:30.932966 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:30 crc kubenswrapper[4845]: I1006 06:46:30.932975 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:30Z","lastTransitionTime":"2025-10-06T06:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.035836 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.035885 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.036001 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.036048 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.036063 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:31Z","lastTransitionTime":"2025-10-06T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.138651 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.138730 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.138753 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.138784 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.138808 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:31Z","lastTransitionTime":"2025-10-06T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.241784 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.241846 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.241857 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.241879 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.241894 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:31Z","lastTransitionTime":"2025-10-06T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.344942 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.345026 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.345048 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.345085 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.345110 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:31Z","lastTransitionTime":"2025-10-06T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.448304 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.448400 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.448420 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.448447 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.448464 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:31Z","lastTransitionTime":"2025-10-06T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.552319 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.552402 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.552413 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.552432 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.552445 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:31Z","lastTransitionTime":"2025-10-06T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.655646 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.655716 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.655735 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.655767 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.655787 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:31Z","lastTransitionTime":"2025-10-06T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.693577 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovnkube-controller/3.log" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.695297 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovnkube-controller/2.log" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.699749 4845 generic.go:334] "Generic (PLEG): container finished" podID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerID="87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce" exitCode=1 Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.699798 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerDied","Data":"87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce"} Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.699848 4845 scope.go:117] "RemoveContainer" containerID="b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.701048 4845 scope.go:117] "RemoveContainer" containerID="87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce" Oct 06 06:46:31 crc kubenswrapper[4845]: E1006 06:46:31.701497 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.722294 4845 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:31Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.734924 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:31Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.759839 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.759628 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f280
8b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769
f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:31Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.759918 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.760193 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.760222 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.760241 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:31Z","lastTransitionTime":"2025-10-06T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.780128 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2634dfb4315764b425165e34de939bda7cddade0d5a837a3d88e35d0e13bcce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metri
cs-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07a832ca018ee1d21598e72fa0fd375e08549e9298392b50cb87a4cfc20b9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:31Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.797040 4845 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff96c219-4289-4880-a8ab-ed6da7557dcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec9a2dc2d359f53f2165220af84f6ee28bfef0db39b86b12ebf600727e32e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03b87b0a15c19a70cdb61338f704315d219bd382c5ced04b22851ef71a71464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e270831b0c56a937ba5cd52f2367bce7b5e8a2837c035aa4e112439829cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:31Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.814899 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:31Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.846613 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0e3798b99ad0b6634733a5806aa3414ed9e1e40869d2f7555bf4cfe28d7c21c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:05Z\\\",\\\"message\\\":\\\"4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 06:46:05.017663 6538 services_controller.go:434] Service 
openshift-machine-api/control-plane-machine-set-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{control-plane-machine-set-operator openshift-machine-api ffd0ef27-d28d-43cc-90c8-0e8843e4c04c 4409 0 2025-02-23 05:12:21 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:control-plane-machine-set-operator] map[capability.openshift.io/name:MachineAPI exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true service.alpha.openshift.io/serving-cert-secret-name:control-plane-machine-set-operator-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075e04d7 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: control-plane-machine-set-operator,},ClusterIP:10.217.4.41,Type:ClusterIP,ExternalIP\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:46:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:31Z\\\",\\\"message\\\":\\\"red: retrying failed objects of type *v1.Pod\\\\nI1006 06:46:31.142169 6901 ovnkube.go:599] Stopped ovnkube\\\\nI1006 06:46:31.142195 6901 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1006 06:46:31.142170 6901 obj_retry.go:409] Going to retry *v1.Pod resource setup for 14 objects: [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g 
openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-dns/node-resolver-689qf openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-machine-config-operator/machine-config-daemon-tpgm6 openshift-multus/multus-zpn9l openshift-multus/network-metrics-daemon-4l7qj openshift-kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/multus-additional-cni-plugins-m79r8 openshift-network-diagnostics/network-check-target-xd92c openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j]\\\\nF1006 06:46:31.142252 6901 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\
\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:31Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:31 crc 
kubenswrapper[4845]: I1006 06:46:31.860517 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4l7qj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:31Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:31 crc 
kubenswrapper[4845]: I1006 06:46:31.863143 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.863208 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.863224 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.863246 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.863262 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:31Z","lastTransitionTime":"2025-10-06T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.878048 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:31Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.896726 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5fb957ec713b15c373dd40160d79fa6038407e541c2603db92c1fa4f6e96959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:25Z\\\",\\\"message\\\":\\\"2025-10-06T06:45:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7c6aa4b7-a6f3-4775-936a-c7499910a738\\\\n2025-10-06T06:45:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7c6aa4b7-a6f3-4775-936a-c7499910a738 to /host/opt/cni/bin/\\\\n2025-10-06T06:45:40Z [verbose] multus-daemon started\\\\n2025-10-06T06:45:40Z [verbose] 
Readiness Indicator file check\\\\n2025-10-06T06:46:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:31Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.913509 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a1696
60b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:31Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.935908 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd70decc01c59307c53f001633b131d20f897ba9825503e5d557fa02d39406f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e91
0c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:31Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.956833 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:31Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.966365 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.966561 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.966659 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.966745 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.966832 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:31Z","lastTransitionTime":"2025-10-06T06:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.977816 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:31Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:31 crc kubenswrapper[4845]: I1006 06:46:31.998258 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run
/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:31Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.018218 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:32Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.034427 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:32Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.070354 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.070444 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.070463 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:32 crc 
kubenswrapper[4845]: I1006 06:46:32.070489 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.070520 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:32Z","lastTransitionTime":"2025-10-06T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.173324 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.173454 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.173474 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.173499 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.173518 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:32Z","lastTransitionTime":"2025-10-06T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.226504 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.226564 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.226636 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.226673 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:32 crc kubenswrapper[4845]: E1006 06:46:32.226763 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:46:32 crc kubenswrapper[4845]: E1006 06:46:32.226879 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:32 crc kubenswrapper[4845]: E1006 06:46:32.227128 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:46:32 crc kubenswrapper[4845]: E1006 06:46:32.227245 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.277177 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.277241 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.277259 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.277286 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.277305 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:32Z","lastTransitionTime":"2025-10-06T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.380630 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.380667 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.380680 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.380696 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.380708 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:32Z","lastTransitionTime":"2025-10-06T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.483865 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.483919 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.483937 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.483961 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.483978 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:32Z","lastTransitionTime":"2025-10-06T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.587730 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.587794 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.587809 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.587831 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.587851 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:32Z","lastTransitionTime":"2025-10-06T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.691709 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.691771 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.691785 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.691810 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.691827 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:32Z","lastTransitionTime":"2025-10-06T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.705353 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovnkube-controller/3.log" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.709288 4845 scope.go:117] "RemoveContainer" containerID="87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce" Oct 06 06:46:32 crc kubenswrapper[4845]: E1006 06:46:32.709470 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.725649 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd70decc01c59307c53f001633b131d20f897ba9825503e5d557fa02d39406f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e91
0c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:32Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.742089 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:32Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.759654 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:46:32Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.778880 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c
6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:32Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.794606 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.794649 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.794661 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.794681 4845 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.794694 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:32Z","lastTransitionTime":"2025-10-06T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.797079 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:32Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.814998 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:32Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.832004 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:32Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.844517 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:32Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.866848 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f5
1bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:32Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.884901 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2634dfb4315764b425165e34de939bda7cddade0d5a837a3d88e35d0e13bcce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07a832ca018ee1d21598e72fa0fd375e0854
9e9298392b50cb87a4cfc20b9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:32Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.896926 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.896964 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.896976 4845 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.896993 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.897005 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:32Z","lastTransitionTime":"2025-10-06T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.904167 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff96c219-4289-4880-a8ab-ed6da7557dcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec9a2dc2d359f53f2165220af84f6ee28bfef0db39b86b12ebf600727e32e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03b87b0a15c19a70cdb61338f704315d219bd382c5ced04b22851ef71a71464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e270831b0c56a937ba5cd52f2367bce7b5e8a2837c035aa4e112439829cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:32Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.924461 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:32Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.947443 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:31Z\\\",\\\"message\\\":\\\"red: retrying failed objects of type *v1.Pod\\\\nI1006 06:46:31.142169 6901 ovnkube.go:599] Stopped ovnkube\\\\nI1006 06:46:31.142195 6901 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1006 06:46:31.142170 6901 obj_retry.go:409] Going to retry *v1.Pod resource setup for 14 objects: 
[openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-dns/node-resolver-689qf openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-machine-config-operator/machine-config-daemon-tpgm6 openshift-multus/multus-zpn9l openshift-multus/network-metrics-daemon-4l7qj openshift-kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/multus-additional-cni-plugins-m79r8 openshift-network-diagnostics/network-check-target-xd92c openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j]\\\\nF1006 06:46:31.142252 6901 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:46:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb87
5e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:32Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.959807 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4l7qj\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:32Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.972385 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:32Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc 
kubenswrapper[4845]: I1006 06:46:32.987103 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5fb957ec713b15c373dd40160d79fa6038407e541c2603db92c1fa4f6e96959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:25Z\\\",\\\"message\\\":\\\"2025-10-06T06:45:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7c6aa4b7-a6f3-4775-936a-c7499910a738\\\\n2025-10-06T06:45:39+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_7c6aa4b7-a6f3-4775-936a-c7499910a738 to /host/opt/cni/bin/\\\\n2025-10-06T06:45:40Z [verbose] multus-daemon started\\\\n2025-10-06T06:45:40Z [verbose] Readiness Indicator file check\\\\n2025-10-06T06:46:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:32Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.998353 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:32Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.999658 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.999690 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.999699 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:32 crc kubenswrapper[4845]: I1006 06:46:32.999714 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:32.999724 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:32Z","lastTransitionTime":"2025-10-06T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.102763 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.102886 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.102900 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.102942 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.102952 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:33Z","lastTransitionTime":"2025-10-06T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.205266 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.205310 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.205323 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.205344 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.205360 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:33Z","lastTransitionTime":"2025-10-06T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.308744 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.308871 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.308892 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.308919 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.308940 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:33Z","lastTransitionTime":"2025-10-06T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.411586 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.411639 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.411659 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.411684 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.411700 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:33Z","lastTransitionTime":"2025-10-06T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.513970 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.514034 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.514047 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.514062 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.514072 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:33Z","lastTransitionTime":"2025-10-06T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.616816 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.616862 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.616888 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.616911 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.616925 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:33Z","lastTransitionTime":"2025-10-06T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.720434 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.720503 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.720529 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.720564 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.720590 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:33Z","lastTransitionTime":"2025-10-06T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.824566 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.824625 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.824643 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.824670 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.824689 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:33Z","lastTransitionTime":"2025-10-06T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.928283 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.928334 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.928350 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.928389 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:33 crc kubenswrapper[4845]: I1006 06:46:33.928404 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:33Z","lastTransitionTime":"2025-10-06T06:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.031819 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.031896 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.031915 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.031944 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.031964 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:34Z","lastTransitionTime":"2025-10-06T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.136228 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.136300 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.136313 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.136334 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.136349 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:34Z","lastTransitionTime":"2025-10-06T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.226568 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.226586 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.226589 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.226764 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:34 crc kubenswrapper[4845]: E1006 06:46:34.227126 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:46:34 crc kubenswrapper[4845]: E1006 06:46:34.227267 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:46:34 crc kubenswrapper[4845]: E1006 06:46:34.227524 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:34 crc kubenswrapper[4845]: E1006 06:46:34.228048 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.238007 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.238072 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.238097 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.238122 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.238141 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:34Z","lastTransitionTime":"2025-10-06T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.341674 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.341744 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.341762 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.341837 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.341857 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:34Z","lastTransitionTime":"2025-10-06T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.388163 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.388208 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.388218 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.388236 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.388250 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:34Z","lastTransitionTime":"2025-10-06T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:34 crc kubenswrapper[4845]: E1006 06:46:34.400718 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:34Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.404714 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.404756 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.404766 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.404805 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.404816 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:34Z","lastTransitionTime":"2025-10-06T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:34 crc kubenswrapper[4845]: E1006 06:46:34.421230 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:34Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.426344 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.426442 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.426465 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.426494 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.426514 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:34Z","lastTransitionTime":"2025-10-06T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:34 crc kubenswrapper[4845]: E1006 06:46:34.440867 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:34Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.445549 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.445658 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.445687 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.445726 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.445850 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:34Z","lastTransitionTime":"2025-10-06T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.468629 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.468679 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.468691 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.468708 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.468719 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:34Z","lastTransitionTime":"2025-10-06T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:34 crc kubenswrapper[4845]: E1006 06:46:34.480175 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eab3b1f-e032-4e17-acfe-a00e1d48a232\\\",\\\"systemUUID\\\":\\\"f0a5d4d4-d5ce-4bb6-8016-8b16f8a9c985\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:34Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:34 crc kubenswrapper[4845]: E1006 06:46:34.480291 4845 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.482044 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.482088 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.482097 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.482116 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.482126 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:34Z","lastTransitionTime":"2025-10-06T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.584565 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.584611 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.584621 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.584639 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.584650 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:34Z","lastTransitionTime":"2025-10-06T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.687343 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.687417 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.687427 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.687441 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.687451 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:34Z","lastTransitionTime":"2025-10-06T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.789995 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.790049 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.790072 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.790094 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.790136 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:34Z","lastTransitionTime":"2025-10-06T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.892148 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.892202 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.892219 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.892241 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.892257 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:34Z","lastTransitionTime":"2025-10-06T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.994637 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.994674 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.994686 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.994702 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:34 crc kubenswrapper[4845]: I1006 06:46:34.994714 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:34Z","lastTransitionTime":"2025-10-06T06:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.096748 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.096803 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.096818 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.096837 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.096849 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:35Z","lastTransitionTime":"2025-10-06T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.198796 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.198832 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.198840 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.198855 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.198868 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:35Z","lastTransitionTime":"2025-10-06T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.239988 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.302293 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.302356 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.302370 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.302409 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.302422 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:35Z","lastTransitionTime":"2025-10-06T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.406348 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.406448 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.406467 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.406495 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.406514 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:35Z","lastTransitionTime":"2025-10-06T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.509523 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.509590 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.509610 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.509639 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.509659 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:35Z","lastTransitionTime":"2025-10-06T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.612519 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.612562 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.612573 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.612592 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.612604 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:35Z","lastTransitionTime":"2025-10-06T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.715592 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.715692 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.715725 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.715765 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.715791 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:35Z","lastTransitionTime":"2025-10-06T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.818827 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.818867 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.818876 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.818892 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.818905 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:35Z","lastTransitionTime":"2025-10-06T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.921555 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.921665 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.921684 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.921713 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:35 crc kubenswrapper[4845]: I1006 06:46:35.921732 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:35Z","lastTransitionTime":"2025-10-06T06:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.024713 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.024799 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.024829 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.024861 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.024885 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:36Z","lastTransitionTime":"2025-10-06T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.127289 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.127330 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.127338 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.127351 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.127360 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:36Z","lastTransitionTime":"2025-10-06T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.226397 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.226435 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:36 crc kubenswrapper[4845]: E1006 06:46:36.226568 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.226805 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.226819 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:36 crc kubenswrapper[4845]: E1006 06:46:36.226931 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:46:36 crc kubenswrapper[4845]: E1006 06:46:36.227025 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:36 crc kubenswrapper[4845]: E1006 06:46:36.227210 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.231412 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.231489 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.231514 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.231544 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.231579 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:36Z","lastTransitionTime":"2025-10-06T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.244789 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8bzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"999aace8-0c91-47c0-aee3-439e419a45c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad82a81a64a169660b3acaf7c27a48d6481b7f7a045ea08c63da16d8e7d105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jbr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8bzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:36Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.262987 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a08664ce-95f4-42b5-af53-414a26d50a99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5faf0556fb62654c41b0faf33af8e5c2600f152e9d1d56503350facfb81f8a\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a84572ce22cbea68049578cfb55be5a69871367cd85827279d8f2bf54c46cac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84572ce22cbea68049578cfb55be5a69871367cd85827279d8f2bf54c46cac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:36Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.285259 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6936952c-09f0-48fd-8832-38c18202ae81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8723fc81762ad33ca72b2c925574609cbf9557916ae4f51016cd5c5868cedf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMo
unts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9zb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tpgm6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:36Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.306129 4845 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-zpn9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2080026c-9eee-4863-b62d-e9ce4d4525dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5fb957ec713b15c373dd40160d79fa6038407e541c2603db92c1fa4f6e96959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:25Z\\\",\\\"message\\\":\\\"2025-10-06T06:45:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7c6aa4b7-a6f3-4775-936a-c7499910a738\\\\n2025-10-06T06:45:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7c6aa4b7-a6f3-4775-936a-c7499910a738 to 
/host/opt/cni/bin/\\\\n2025-10-06T06:45:40Z [verbose] multus-daemon started\\\\n2025-10-06T06:45:40Z [verbose] Readiness Indicator file check\\\\n2025-10-06T06:46:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-cert
s\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2xhsv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zpn9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:36Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.329021 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5c258adfea3b70f9df6106782015cb5ed830c9e781ab41f238dbbb4df68090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://803f9e6b141c90c6ad08e1ff07da74bb265d8ba6d7c2f74169ee32b5c3054c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:36Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.333801 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.333856 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.333872 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.333933 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.333952 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:36Z","lastTransitionTime":"2025-10-06T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.348444 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:36Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.365260 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:36Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.384966 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02fbdb16-0e21-465c-bf84-0c5e4a6b2ac9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30da526d6fcbb9274f368a925fac71a765ab5d636931a67f7ca19f25548969ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a2b67999a1775d81cbcf105a71c5542173f9451d2e836b1341377bb254d581\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d02fee6a427dd6bc5f301de2ec4aa370dc90bf891b92e73f7fdbabf5390629e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd70decc01c59307c53f001633b131d20f897ba9825503e5d557fa02d39406f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526493e931c67e2a46a7be034a884ac00730ca4a32a0df2734982d539ddd38cb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T06:45:36Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1006 06:45:36.406174 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 06:45:36.406317 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 06:45:36.407077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-903392046/tls.crt::/tmp/serving-cert-903392046/tls.key\\\\\\\"\\\\nI1006 06:45:36.643545 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 06:45:36.652565 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 06:45:36.652586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 06:45:36.652613 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 06:45:36.652620 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 06:45:36.658509 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1006 06:45:36.658522 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 06:45:36.658551 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 06:45:36.658560 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 06:45:36.658593 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 06:45:36.658599 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 06:45:36.658603 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 06:45:36.660661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c937d0a01ca48f2a5af3b5aabb947b77c6db740ca8b6a8eb5893ccabdc9e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e910c5a4d70c3de087c706fd0c213486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eebac6085a04ff10f05cead4fab021e91
0c5a4d70c3de087c706fd0c213486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:36Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.400710 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"128872e4-d4fb-4309-bbc3-47f8254f459d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46bfce908e0f2adbab7b39ad8ace2ca570265f57b268848dcb72555fbd015b27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f704940f16e58d681ce9e6ae866bf1f460db7fd0a91f3b0431735cbcb260309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3372e5778ae50c8983a1bd0a2fc5f324e6452355d64e6c140e9aed7e955d9878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a27a95d8ef44e2e34624224c120b716bdf1db52f4dea2a7e4067c9e41ee7c6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T06:45:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:36Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.415168 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f60dabf85d16469d5841575c07795c7e5db35e6d1e4bce1da6f04506416ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T06:46:36Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.428316 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce71ec75-9d46-43ff-a08e-430ef60a6d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2634dfb4315764b425165e34de939bda7cddade0d5a837a3d88e35d0e13bcce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07a832ca018ee1d21598e72fa0fd375e08549e9298392b50cb87a4cfc20b9e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lplhk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-smc4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:36Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.437544 4845 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.437637 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.437663 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.437697 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.437716 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:36Z","lastTransitionTime":"2025-10-06T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.442075 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:36Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.453763 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-689qf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"453226ed-506e-48cb-89a8-a03ca92660e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93575903d778fd551c97960f9588f6c7962812e5a9f054b6db4c8f1cdf996f68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-689qf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:36Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.476858 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-m79r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"331140be-ed04-4023-b244-31f5817b8803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42d4ab9df56eb72015fd0347089438f60c1a72fa9265fd17a0db1aad00d7c787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e951116c8eb4561381ac7412630293b9359d175c25b4ee579ff4d520cf9cd69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2134a6eda7a067adc3f0aec035a2e6273cc2aa64d09318ab2daedfbe8d703fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f719a160d70d821fef8576b1b796e4a9f3a878845e2426924dc7ff23a4f6aa7a\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f9976f9c2c4e88767e8c0eb48ef92a01514171dd59e08bc057443bc3533636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f5
1bf880039693844f72d07208786b3698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd25e8cbc7b64078217822b3b3ee0f51bf880039693844f72d07208786b3698\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91d5e83b1769f9bac7bde451982d59e69de50845dd0accd48991c23a45400212\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T06:45:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rl2v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-m79r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:36Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.501519 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58772108-964d-4d0c-90a4-70ad5fe1da2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T06:46:31Z\\\",\\\"message\\\":\\\"red: retrying failed objects of type *v1.Pod\\\\nI1006 06:46:31.142169 6901 
ovnkube.go:599] Stopped ovnkube\\\\nI1006 06:46:31.142195 6901 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1006 06:46:31.142170 6901 obj_retry.go:409] Going to retry *v1.Pod resource setup for 14 objects: [openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-dns/node-resolver-689qf openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-machine-config-operator/machine-config-daemon-tpgm6 openshift-multus/multus-zpn9l openshift-multus/network-metrics-daemon-4l7qj openshift-kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/multus-additional-cni-plugins-m79r8 openshift-network-diagnostics/network-check-target-xd92c openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j]\\\\nF1006 06:46:31.142252 6901 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T06:46:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5ab89a1a3cbdb87
5e4ee0d308d562968f43dd86515e049141287b975f1084\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zd8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-587xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:36Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.517338 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4l7qj\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:36Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.531215 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff96c219-4289-4880-a8ab-ed6da7557dcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:46:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec9a2dc2d359f53f2165220af84f6ee28bfef0db39b86b12ebf600727e32e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03b87b0a15c19a70cdb61338f704315d219bd382c5ced04b22851ef71a71464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e270831b0c56a937ba5cd52f2367bce7b5e8a2837c035aa4e112439829cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://949700a9
b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://949700a9b613f1e8de0ad3f9f89e8e55f24c30b28bf1379d52a8e4253c413872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T06:45:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T06:45:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T06:45:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:36Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.540890 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.541060 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.541148 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.541240 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 
06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.541345 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:36Z","lastTransitionTime":"2025-10-06T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.547889 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T06:45:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://480605d076ab00903792208218f53aa4c8e692e07bf20b83ee5497b94ca7f374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T06:45:37Z\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T06:46:36Z is after 2025-08-24T17:21:41Z" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.643698 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.643793 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.643815 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.643841 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.643859 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:36Z","lastTransitionTime":"2025-10-06T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.746355 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.746434 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.746452 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.746474 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.746490 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:36Z","lastTransitionTime":"2025-10-06T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.849269 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.849324 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.849334 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.849347 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.849356 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:36Z","lastTransitionTime":"2025-10-06T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.952216 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.952252 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.952261 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.952275 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:36 crc kubenswrapper[4845]: I1006 06:46:36.952284 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:36Z","lastTransitionTime":"2025-10-06T06:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.054143 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.054218 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.054243 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.054271 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.054294 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:37Z","lastTransitionTime":"2025-10-06T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.156294 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.156361 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.156405 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.156438 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.156461 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:37Z","lastTransitionTime":"2025-10-06T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.259709 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.259843 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.259866 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.259882 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.259895 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:37Z","lastTransitionTime":"2025-10-06T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.363226 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.363274 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.363295 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.363320 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.363339 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:37Z","lastTransitionTime":"2025-10-06T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.466912 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.466952 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.467006 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.467028 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.467041 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:37Z","lastTransitionTime":"2025-10-06T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.571705 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.571754 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.571763 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.571781 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.571791 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:37Z","lastTransitionTime":"2025-10-06T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.674531 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.674945 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.674965 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.674992 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.675010 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:37Z","lastTransitionTime":"2025-10-06T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.778065 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.778137 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.778160 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.778189 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.778210 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:37Z","lastTransitionTime":"2025-10-06T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.880754 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.880876 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.880902 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.880933 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.880955 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:37Z","lastTransitionTime":"2025-10-06T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.984024 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.984079 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.984100 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.984125 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:37 crc kubenswrapper[4845]: I1006 06:46:37.984143 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:37Z","lastTransitionTime":"2025-10-06T06:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.087221 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.087257 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.087267 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.087299 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.087310 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:38Z","lastTransitionTime":"2025-10-06T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.190628 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.190666 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.190678 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.190694 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.190706 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:38Z","lastTransitionTime":"2025-10-06T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.226673 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.226697 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.226697 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:38 crc kubenswrapper[4845]: E1006 06:46:38.226875 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.227077 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:38 crc kubenswrapper[4845]: E1006 06:46:38.227220 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:38 crc kubenswrapper[4845]: E1006 06:46:38.227269 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:46:38 crc kubenswrapper[4845]: E1006 06:46:38.227336 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.293996 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.294285 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.294356 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.294493 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.294562 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:38Z","lastTransitionTime":"2025-10-06T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.397407 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.397655 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.397741 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.397850 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.397957 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:38Z","lastTransitionTime":"2025-10-06T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.500689 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.500740 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.500756 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.500775 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.500789 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:38Z","lastTransitionTime":"2025-10-06T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.602769 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.603040 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.603132 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.603211 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.603281 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:38Z","lastTransitionTime":"2025-10-06T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.705490 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.705534 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.705543 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.705561 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.705572 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:38Z","lastTransitionTime":"2025-10-06T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.808023 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.808061 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.808071 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.808086 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.808097 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:38Z","lastTransitionTime":"2025-10-06T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.911409 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.911945 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.912044 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.912120 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:38 crc kubenswrapper[4845]: I1006 06:46:38.912214 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:38Z","lastTransitionTime":"2025-10-06T06:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.014576 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.014606 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.014615 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.014629 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.014637 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:39Z","lastTransitionTime":"2025-10-06T06:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.117792 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.118116 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.118245 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.118406 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.118509 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:39Z","lastTransitionTime":"2025-10-06T06:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.220946 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.221010 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.221021 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.221034 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.221044 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:39Z","lastTransitionTime":"2025-10-06T06:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.323434 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.323496 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.323513 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.323539 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.323579 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:39Z","lastTransitionTime":"2025-10-06T06:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.426350 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.426685 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.426857 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.427007 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.427157 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:39Z","lastTransitionTime":"2025-10-06T06:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.530768 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.531020 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.531143 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.531253 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.531494 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:39Z","lastTransitionTime":"2025-10-06T06:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.634789 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.634855 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.634877 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.634905 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.634926 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:39Z","lastTransitionTime":"2025-10-06T06:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.736842 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.736914 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.737004 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.737040 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.737063 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:39Z","lastTransitionTime":"2025-10-06T06:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.840304 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.840412 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.840432 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.840455 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.840471 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:39Z","lastTransitionTime":"2025-10-06T06:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.943251 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.943325 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.943351 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.943423 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:39 crc kubenswrapper[4845]: I1006 06:46:39.943441 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:39Z","lastTransitionTime":"2025-10-06T06:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.046614 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.046663 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.046680 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.046706 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.046725 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:40Z","lastTransitionTime":"2025-10-06T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.153572 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.153660 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.153677 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.153699 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.153718 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:40Z","lastTransitionTime":"2025-10-06T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.226652 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.226682 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.226829 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:40 crc kubenswrapper[4845]: E1006 06:46:40.227019 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.227054 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:40 crc kubenswrapper[4845]: E1006 06:46:40.227189 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:46:40 crc kubenswrapper[4845]: E1006 06:46:40.227335 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:40 crc kubenswrapper[4845]: E1006 06:46:40.227555 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.255358 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.255440 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.255460 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.255482 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.255517 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:40Z","lastTransitionTime":"2025-10-06T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.355626 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:46:40 crc kubenswrapper[4845]: E1006 06:46:40.355811 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:44.355774304 +0000 UTC m=+148.870515352 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.355868 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.356023 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.356086 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.356120 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:46:40 crc kubenswrapper[4845]: E1006 06:46:40.356193 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 06:46:40 crc kubenswrapper[4845]: E1006 06:46:40.356262 4845 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 06:46:40 crc kubenswrapper[4845]: E1006 06:46:40.356269 4845 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 06:46:40 crc kubenswrapper[4845]: E1006 06:46:40.356325 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 06:47:44.356310976 +0000 UTC m=+148.871052024 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 06:46:40 crc kubenswrapper[4845]: E1006 06:46:40.356334 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 06:46:40 crc kubenswrapper[4845]: E1006 06:46:40.356282 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 06:46:40 crc kubenswrapper[4845]: E1006 06:46:40.356418 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 06:46:40 crc kubenswrapper[4845]: E1006 06:46:40.356441 4845 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:46:40 crc kubenswrapper[4845]: E1006 06:46:40.356450 4845 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:46:40 crc kubenswrapper[4845]: E1006 06:46:40.356394 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 06:47:44.356346277 +0000 UTC m=+148.871087305 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 06:46:40 crc kubenswrapper[4845]: E1006 06:46:40.356608 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 06:47:44.356568503 +0000 UTC m=+148.871309551 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:46:40 crc kubenswrapper[4845]: E1006 06:46:40.356658 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 06:47:44.356641424 +0000 UTC m=+148.871382472 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.358499 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.358550 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.358569 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.358593 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.358614 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:40Z","lastTransitionTime":"2025-10-06T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.462411 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.462458 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.462470 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.462487 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.462498 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:40Z","lastTransitionTime":"2025-10-06T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.565342 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.565413 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.565431 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.565456 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.565472 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:40Z","lastTransitionTime":"2025-10-06T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.668545 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.668603 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.668630 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.668653 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.668671 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:40Z","lastTransitionTime":"2025-10-06T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.770843 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.770876 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.770885 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.770897 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.770907 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:40Z","lastTransitionTime":"2025-10-06T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.873294 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.873328 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.873339 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.873354 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.873362 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:40Z","lastTransitionTime":"2025-10-06T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.975832 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.975868 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.975877 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.975890 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:40 crc kubenswrapper[4845]: I1006 06:46:40.975899 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:40Z","lastTransitionTime":"2025-10-06T06:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.078136 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.078199 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.078212 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.078250 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.078267 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:41Z","lastTransitionTime":"2025-10-06T06:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.181793 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.181849 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.181858 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.181873 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.181882 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:41Z","lastTransitionTime":"2025-10-06T06:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.284480 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.284532 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.284542 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.284555 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.284565 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:41Z","lastTransitionTime":"2025-10-06T06:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.387106 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.387150 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.387161 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.387179 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.387190 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:41Z","lastTransitionTime":"2025-10-06T06:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.489391 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.489446 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.489456 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.489469 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.489478 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:41Z","lastTransitionTime":"2025-10-06T06:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.592162 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.592208 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.592225 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.592251 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.592271 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:41Z","lastTransitionTime":"2025-10-06T06:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.695122 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.695236 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.695267 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.695295 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.695318 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:41Z","lastTransitionTime":"2025-10-06T06:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.797738 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.797793 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.797812 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.797840 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.797857 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:41Z","lastTransitionTime":"2025-10-06T06:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.901281 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.901326 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.901335 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.901351 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:41 crc kubenswrapper[4845]: I1006 06:46:41.901361 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:41Z","lastTransitionTime":"2025-10-06T06:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.003559 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.003588 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.003598 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.003613 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.003622 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:42Z","lastTransitionTime":"2025-10-06T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.105240 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.105502 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.105520 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.105537 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.105549 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:42Z","lastTransitionTime":"2025-10-06T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.207389 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.207417 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.207425 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.207438 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.207447 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:42Z","lastTransitionTime":"2025-10-06T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.225974 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.226012 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.225987 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:46:42 crc kubenswrapper[4845]: E1006 06:46:42.226150 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.226178 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:46:42 crc kubenswrapper[4845]: E1006 06:46:42.226239 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:46:42 crc kubenswrapper[4845]: E1006 06:46:42.226340 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 06:46:42 crc kubenswrapper[4845]: E1006 06:46:42.226460 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.310239 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.310272 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.310283 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.310299 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.310333 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:42Z","lastTransitionTime":"2025-10-06T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.413634 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.413681 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.413697 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.413718 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.413734 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:42Z","lastTransitionTime":"2025-10-06T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.516241 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.516300 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.516326 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.516354 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.516403 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:42Z","lastTransitionTime":"2025-10-06T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.618686 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.618723 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.618732 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.618747 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.618757 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:42Z","lastTransitionTime":"2025-10-06T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.720976 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.721052 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.721077 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.721104 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.721123 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:42Z","lastTransitionTime":"2025-10-06T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.823844 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.823913 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.823937 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.823970 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.823996 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:42Z","lastTransitionTime":"2025-10-06T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.926986 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.927047 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.927064 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.927091 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:42 crc kubenswrapper[4845]: I1006 06:46:42.927108 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:42Z","lastTransitionTime":"2025-10-06T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.029897 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.029948 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.029964 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.029989 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.030007 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:43Z","lastTransitionTime":"2025-10-06T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.133068 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.133128 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.133145 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.133168 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.133186 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:43Z","lastTransitionTime":"2025-10-06T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.236818 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.236874 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.236890 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.236915 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.236931 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:43Z","lastTransitionTime":"2025-10-06T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.339980 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.340029 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.340046 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.340069 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.340087 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:43Z","lastTransitionTime":"2025-10-06T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.443358 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.443457 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.443482 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.443513 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.443535 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:43Z","lastTransitionTime":"2025-10-06T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.546227 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.546279 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.546297 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.546319 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.546334 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:43Z","lastTransitionTime":"2025-10-06T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.649097 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.649126 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.649136 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.649151 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.649161 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:43Z","lastTransitionTime":"2025-10-06T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.751520 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.751587 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.751610 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.751640 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.751667 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:43Z","lastTransitionTime":"2025-10-06T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.854298 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.854355 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.854401 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.854433 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.854457 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:43Z","lastTransitionTime":"2025-10-06T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.957856 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.957921 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.957941 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.957989 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:43 crc kubenswrapper[4845]: I1006 06:46:43.958013 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:43Z","lastTransitionTime":"2025-10-06T06:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.060497 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.060545 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.060565 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.060588 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.060604 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:44Z","lastTransitionTime":"2025-10-06T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.165250 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.165310 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.165327 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.165352 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.165397 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:44Z","lastTransitionTime":"2025-10-06T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.226440 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.226495 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.226453 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:46:44 crc kubenswrapper[4845]: E1006 06:46:44.226653 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d"
Oct 06 06:46:44 crc kubenswrapper[4845]: E1006 06:46:44.226728 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.226781 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:46:44 crc kubenswrapper[4845]: E1006 06:46:44.226820 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:46:44 crc kubenswrapper[4845]: E1006 06:46:44.226929 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.268083 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.268161 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.268181 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.268205 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.268222 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:44Z","lastTransitionTime":"2025-10-06T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.373034 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.373061 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.373070 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.373082 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.373091 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:44Z","lastTransitionTime":"2025-10-06T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.475444 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.475507 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.475518 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.475534 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.475544 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:44Z","lastTransitionTime":"2025-10-06T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.578107 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.578171 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.578193 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.578221 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.578242 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:44Z","lastTransitionTime":"2025-10-06T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.681300 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.681337 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.681345 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.681362 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.681431 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:44Z","lastTransitionTime":"2025-10-06T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.784195 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.784234 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.784246 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.784260 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.784268 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:44Z","lastTransitionTime":"2025-10-06T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.877817 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.877876 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.877893 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.877917 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.877936 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:44Z","lastTransitionTime":"2025-10-06T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.909738 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.909777 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.909786 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.909801 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.909811 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T06:46:44Z","lastTransitionTime":"2025-10-06T06:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.931625 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv"] Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.932041 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.934416 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.935169 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.935234 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.935289 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.955402 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zpn9l" podStartSLOduration=68.955226047 podStartE2EDuration="1m8.955226047s" podCreationTimestamp="2025-10-06 06:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:46:44.951817747 +0000 UTC m=+89.466558765" watchObservedRunningTime="2025-10-06 06:46:44.955226047 +0000 UTC m=+89.469967065" Oct 06 06:46:44 crc kubenswrapper[4845]: I1006 06:46:44.975862 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8bzqb" podStartSLOduration=67.975840315 podStartE2EDuration="1m7.975840315s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:46:44.963215886 +0000 UTC m=+89.477956944" watchObservedRunningTime="2025-10-06 06:46:44.975840315 +0000 UTC m=+89.490581333" Oct 06 06:46:44 crc 
kubenswrapper[4845]: I1006 06:46:44.999630 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=9.999611347 podStartE2EDuration="9.999611347s" podCreationTimestamp="2025-10-06 06:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:46:44.987116742 +0000 UTC m=+89.501857770" watchObservedRunningTime="2025-10-06 06:46:44.999611347 +0000 UTC m=+89.514352355" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.001809 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0606493-8050-4be6-b715-4db60022da3c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-x8zkv\" (UID: \"e0606493-8050-4be6-b715-4db60022da3c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.001856 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e0606493-8050-4be6-b715-4db60022da3c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-x8zkv\" (UID: \"e0606493-8050-4be6-b715-4db60022da3c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.001872 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0606493-8050-4be6-b715-4db60022da3c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-x8zkv\" (UID: \"e0606493-8050-4be6-b715-4db60022da3c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.001906 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0606493-8050-4be6-b715-4db60022da3c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-x8zkv\" (UID: \"e0606493-8050-4be6-b715-4db60022da3c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.001927 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e0606493-8050-4be6-b715-4db60022da3c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-x8zkv\" (UID: \"e0606493-8050-4be6-b715-4db60022da3c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.011721 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podStartSLOduration=69.011704453 podStartE2EDuration="1m9.011704453s" podCreationTimestamp="2025-10-06 06:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:46:45.000009267 +0000 UTC m=+89.514750315" watchObservedRunningTime="2025-10-06 06:46:45.011704453 +0000 UTC m=+89.526445471" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.087275 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=66.08725494 podStartE2EDuration="1m6.08725494s" podCreationTimestamp="2025-10-06 06:45:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:46:45.086889901 +0000 UTC m=+89.601630929" watchObservedRunningTime="2025-10-06 06:46:45.08725494 
+0000 UTC m=+89.601995958" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.087841 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.087831883 podStartE2EDuration="1m8.087831883s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:46:45.074859607 +0000 UTC m=+89.589600615" watchObservedRunningTime="2025-10-06 06:46:45.087831883 +0000 UTC m=+89.602572901" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.102604 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0606493-8050-4be6-b715-4db60022da3c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-x8zkv\" (UID: \"e0606493-8050-4be6-b715-4db60022da3c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.102665 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e0606493-8050-4be6-b715-4db60022da3c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-x8zkv\" (UID: \"e0606493-8050-4be6-b715-4db60022da3c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.102656 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-m79r8" podStartSLOduration=69.102642854 podStartE2EDuration="1m9.102642854s" podCreationTimestamp="2025-10-06 06:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:46:45.102031189 +0000 UTC m=+89.616772207" watchObservedRunningTime="2025-10-06 
06:46:45.102642854 +0000 UTC m=+89.617383862" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.102700 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0606493-8050-4be6-b715-4db60022da3c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-x8zkv\" (UID: \"e0606493-8050-4be6-b715-4db60022da3c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.102759 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0606493-8050-4be6-b715-4db60022da3c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-x8zkv\" (UID: \"e0606493-8050-4be6-b715-4db60022da3c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.102794 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e0606493-8050-4be6-b715-4db60022da3c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-x8zkv\" (UID: \"e0606493-8050-4be6-b715-4db60022da3c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.102817 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e0606493-8050-4be6-b715-4db60022da3c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-x8zkv\" (UID: \"e0606493-8050-4be6-b715-4db60022da3c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.102881 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/e0606493-8050-4be6-b715-4db60022da3c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-x8zkv\" (UID: \"e0606493-8050-4be6-b715-4db60022da3c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.103538 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0606493-8050-4be6-b715-4db60022da3c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-x8zkv\" (UID: \"e0606493-8050-4be6-b715-4db60022da3c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.107953 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0606493-8050-4be6-b715-4db60022da3c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-x8zkv\" (UID: \"e0606493-8050-4be6-b715-4db60022da3c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.126962 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0606493-8050-4be6-b715-4db60022da3c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-x8zkv\" (UID: \"e0606493-8050-4be6-b715-4db60022da3c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.130164 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smc4j" podStartSLOduration=68.130152784 podStartE2EDuration="1m8.130152784s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 
06:46:45.115298313 +0000 UTC m=+89.630039321" watchObservedRunningTime="2025-10-06 06:46:45.130152784 +0000 UTC m=+89.644893792" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.141248 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-689qf" podStartSLOduration=69.141229016 podStartE2EDuration="1m9.141229016s" podCreationTimestamp="2025-10-06 06:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:46:45.140783266 +0000 UTC m=+89.655524274" watchObservedRunningTime="2025-10-06 06:46:45.141229016 +0000 UTC m=+89.655970044" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.203211 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=39.203196552 podStartE2EDuration="39.203196552s" podCreationTimestamp="2025-10-06 06:46:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:46:45.202320581 +0000 UTC m=+89.717061589" watchObservedRunningTime="2025-10-06 06:46:45.203196552 +0000 UTC m=+89.717937560" Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.245608 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" Oct 06 06:46:45 crc kubenswrapper[4845]: W1006 06:46:45.257175 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0606493_8050_4be6_b715_4db60022da3c.slice/crio-0447da4d441f45bffbcf192d9055e140a729e85af503b8b9b0213330589ce981 WatchSource:0}: Error finding container 0447da4d441f45bffbcf192d9055e140a729e85af503b8b9b0213330589ce981: Status 404 returned error can't find the container with id 0447da4d441f45bffbcf192d9055e140a729e85af503b8b9b0213330589ce981 Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.751929 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" event={"ID":"e0606493-8050-4be6-b715-4db60022da3c","Type":"ContainerStarted","Data":"45f6988380a89af44e620f6637ff80e16f84ab82289b835d09e8e9c550ecb3f7"} Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.752013 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" event={"ID":"e0606493-8050-4be6-b715-4db60022da3c","Type":"ContainerStarted","Data":"0447da4d441f45bffbcf192d9055e140a729e85af503b8b9b0213330589ce981"} Oct 06 06:46:45 crc kubenswrapper[4845]: I1006 06:46:45.773792 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x8zkv" podStartSLOduration=69.773765256 podStartE2EDuration="1m9.773765256s" podCreationTimestamp="2025-10-06 06:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:46:45.770661162 +0000 UTC m=+90.285402240" watchObservedRunningTime="2025-10-06 06:46:45.773765256 +0000 UTC m=+90.288506304" Oct 06 06:46:46 crc kubenswrapper[4845]: I1006 06:46:46.226576 4845 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:46:46 crc kubenswrapper[4845]: I1006 06:46:46.226597 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:46:46 crc kubenswrapper[4845]: I1006 06:46:46.226686 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:46:46 crc kubenswrapper[4845]: E1006 06:46:46.226784 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 06:46:46 crc kubenswrapper[4845]: E1006 06:46:46.227010 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d"
Oct 06 06:46:46 crc kubenswrapper[4845]: E1006 06:46:46.227123 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:46:46 crc kubenswrapper[4845]: I1006 06:46:46.226597 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:46:46 crc kubenswrapper[4845]: E1006 06:46:46.227669 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 06:46:48 crc kubenswrapper[4845]: I1006 06:46:48.226116 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:46:48 crc kubenswrapper[4845]: I1006 06:46:48.226161 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:46:48 crc kubenswrapper[4845]: I1006 06:46:48.226166 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:46:48 crc kubenswrapper[4845]: E1006 06:46:48.226251 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 06:46:48 crc kubenswrapper[4845]: E1006 06:46:48.226358 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 06:46:48 crc kubenswrapper[4845]: I1006 06:46:48.226447 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:46:48 crc kubenswrapper[4845]: E1006 06:46:48.226673 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:46:48 crc kubenswrapper[4845]: E1006 06:46:48.226703 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d"
Oct 06 06:46:48 crc kubenswrapper[4845]: I1006 06:46:48.227076 4845 scope.go:117] "RemoveContainer" containerID="87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce"
Oct 06 06:46:48 crc kubenswrapper[4845]: E1006 06:46:48.227287 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d"
Oct 06 06:46:50 crc kubenswrapper[4845]: I1006 06:46:50.226024 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:46:50 crc kubenswrapper[4845]: I1006 06:46:50.226028 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:46:50 crc kubenswrapper[4845]: I1006 06:46:50.226075 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:46:50 crc kubenswrapper[4845]: E1006 06:46:50.226157 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 06:46:50 crc kubenswrapper[4845]: I1006 06:46:50.226220 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:46:50 crc kubenswrapper[4845]: E1006 06:46:50.226335 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 06:46:50 crc kubenswrapper[4845]: E1006 06:46:50.226400 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d"
Oct 06 06:46:50 crc kubenswrapper[4845]: E1006 06:46:50.226466 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:46:52 crc kubenswrapper[4845]: I1006 06:46:52.226172 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:46:52 crc kubenswrapper[4845]: I1006 06:46:52.226188 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:46:52 crc kubenswrapper[4845]: E1006 06:46:52.226798 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d"
Oct 06 06:46:52 crc kubenswrapper[4845]: I1006 06:46:52.226397 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:46:52 crc kubenswrapper[4845]: I1006 06:46:52.226258 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:46:52 crc kubenswrapper[4845]: E1006 06:46:52.227037 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 06:46:52 crc kubenswrapper[4845]: E1006 06:46:52.227187 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:46:52 crc kubenswrapper[4845]: E1006 06:46:52.227416 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 06:46:54 crc kubenswrapper[4845]: I1006 06:46:54.226073 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:46:54 crc kubenswrapper[4845]: I1006 06:46:54.226190 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:46:54 crc kubenswrapper[4845]: E1006 06:46:54.226285 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 06:46:54 crc kubenswrapper[4845]: I1006 06:46:54.226357 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:46:54 crc kubenswrapper[4845]: I1006 06:46:54.226399 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:46:54 crc kubenswrapper[4845]: E1006 06:46:54.226508 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 06:46:54 crc kubenswrapper[4845]: E1006 06:46:54.226613 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d"
Oct 06 06:46:54 crc kubenswrapper[4845]: E1006 06:46:54.226767 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:46:55 crc kubenswrapper[4845]: I1006 06:46:55.241514 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"]
Oct 06 06:46:56 crc kubenswrapper[4845]: I1006 06:46:56.126831 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs\") pod \"network-metrics-daemon-4l7qj\" (UID: \"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\") " pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:46:56 crc kubenswrapper[4845]: E1006 06:46:56.126940 4845 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 06 06:46:56 crc kubenswrapper[4845]: E1006 06:46:56.126984 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs podName:f80a2f04-a041-4acb-ace9-c0e40aed5f6d nodeName:}" failed. No retries permitted until 2025-10-06 06:48:00.126971793 +0000 UTC m=+164.641712791 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs") pod "network-metrics-daemon-4l7qj" (UID: "f80a2f04-a041-4acb-ace9-c0e40aed5f6d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 06 06:46:56 crc kubenswrapper[4845]: I1006 06:46:56.226583 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:46:56 crc kubenswrapper[4845]: I1006 06:46:56.226583 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:46:56 crc kubenswrapper[4845]: E1006 06:46:56.227238 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 06:46:56 crc kubenswrapper[4845]: I1006 06:46:56.227289 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:46:56 crc kubenswrapper[4845]: I1006 06:46:56.227317 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:46:56 crc kubenswrapper[4845]: E1006 06:46:56.227461 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 06:46:56 crc kubenswrapper[4845]: E1006 06:46:56.227804 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d"
Oct 06 06:46:56 crc kubenswrapper[4845]: E1006 06:46:56.227949 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:46:58 crc kubenswrapper[4845]: I1006 06:46:58.226554 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:46:58 crc kubenswrapper[4845]: I1006 06:46:58.226565 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:46:58 crc kubenswrapper[4845]: E1006 06:46:58.226703 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:46:58 crc kubenswrapper[4845]: E1006 06:46:58.226816 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d"
Oct 06 06:46:58 crc kubenswrapper[4845]: I1006 06:46:58.227417 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:46:58 crc kubenswrapper[4845]: I1006 06:46:58.227495 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:46:58 crc kubenswrapper[4845]: E1006 06:46:58.227605 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 06:46:58 crc kubenswrapper[4845]: E1006 06:46:58.227662 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 06:47:00 crc kubenswrapper[4845]: I1006 06:47:00.225986 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:47:00 crc kubenswrapper[4845]: I1006 06:47:00.226085 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:47:00 crc kubenswrapper[4845]: E1006 06:47:00.226198 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d"
Oct 06 06:47:00 crc kubenswrapper[4845]: I1006 06:47:00.226506 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:47:00 crc kubenswrapper[4845]: I1006 06:47:00.226549 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:47:00 crc kubenswrapper[4845]: E1006 06:47:00.226638 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:47:00 crc kubenswrapper[4845]: E1006 06:47:00.226853 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 06:47:00 crc kubenswrapper[4845]: E1006 06:47:00.227070 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 06:47:02 crc kubenswrapper[4845]: I1006 06:47:02.226179 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:47:02 crc kubenswrapper[4845]: E1006 06:47:02.226432 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 06:47:02 crc kubenswrapper[4845]: I1006 06:47:02.227757 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:47:02 crc kubenswrapper[4845]: I1006 06:47:02.227792 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:47:02 crc kubenswrapper[4845]: I1006 06:47:02.227784 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:47:02 crc kubenswrapper[4845]: E1006 06:47:02.227955 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 06:47:02 crc kubenswrapper[4845]: E1006 06:47:02.228161 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:47:02 crc kubenswrapper[4845]: I1006 06:47:02.228290 4845 scope.go:117] "RemoveContainer" containerID="87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce"
Oct 06 06:47:02 crc kubenswrapper[4845]: E1006 06:47:02.228362 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d"
Oct 06 06:47:02 crc kubenswrapper[4845]: E1006 06:47:02.228677 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-587xc_openshift-ovn-kubernetes(58772108-964d-4d0c-90a4-70ad5fe1da2d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d"
Oct 06 06:47:04 crc kubenswrapper[4845]: I1006 06:47:04.226427 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:47:04 crc kubenswrapper[4845]: I1006 06:47:04.226352 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:47:04 crc kubenswrapper[4845]: E1006 06:47:04.226628 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 06:47:04 crc kubenswrapper[4845]: I1006 06:47:04.226816 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:47:04 crc kubenswrapper[4845]: E1006 06:47:04.226873 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:47:04 crc kubenswrapper[4845]: E1006 06:47:04.227160 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 06:47:04 crc kubenswrapper[4845]: I1006 06:47:04.227804 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:47:04 crc kubenswrapper[4845]: E1006 06:47:04.228074 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d"
Oct 06 06:47:06 crc kubenswrapper[4845]: I1006 06:47:06.225998 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:47:06 crc kubenswrapper[4845]: I1006 06:47:06.226021 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:47:06 crc kubenswrapper[4845]: I1006 06:47:06.226110 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:47:06 crc kubenswrapper[4845]: I1006 06:47:06.227602 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:47:06 crc kubenswrapper[4845]: E1006 06:47:06.227584 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 06:47:06 crc kubenswrapper[4845]: E1006 06:47:06.227787 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d"
Oct 06 06:47:06 crc kubenswrapper[4845]: E1006 06:47:06.227852 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 06:47:06 crc kubenswrapper[4845]: E1006 06:47:06.227934 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:47:08 crc kubenswrapper[4845]: I1006 06:47:08.226336 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:47:08 crc kubenswrapper[4845]: I1006 06:47:08.226367 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:47:08 crc kubenswrapper[4845]: I1006 06:47:08.226407 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:47:08 crc kubenswrapper[4845]: I1006 06:47:08.226307 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:47:08 crc kubenswrapper[4845]: E1006 06:47:08.226520 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:47:08 crc kubenswrapper[4845]: E1006 06:47:08.226590 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 06:47:08 crc kubenswrapper[4845]: E1006 06:47:08.226632 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 06:47:08 crc kubenswrapper[4845]: E1006 06:47:08.226711 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d"
Oct 06 06:47:10 crc kubenswrapper[4845]: I1006 06:47:10.225930 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:47:10 crc kubenswrapper[4845]: I1006 06:47:10.225931 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:47:10 crc kubenswrapper[4845]: I1006 06:47:10.226003 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:47:10 crc kubenswrapper[4845]: E1006 06:47:10.226955 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d"
Oct 06 06:47:10 crc kubenswrapper[4845]: E1006 06:47:10.226740 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 06:47:10 crc kubenswrapper[4845]: E1006 06:47:10.227018 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 06:47:10 crc kubenswrapper[4845]: I1006 06:47:10.226102 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:47:10 crc kubenswrapper[4845]: E1006 06:47:10.227101 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:47:11 crc kubenswrapper[4845]: I1006 06:47:11.845193 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zpn9l_2080026c-9eee-4863-b62d-e9ce4d4525dd/kube-multus/1.log"
Oct 06 06:47:11 crc kubenswrapper[4845]: I1006 06:47:11.845678 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zpn9l_2080026c-9eee-4863-b62d-e9ce4d4525dd/kube-multus/0.log"
Oct 06 06:47:11 crc kubenswrapper[4845]: I1006 06:47:11.845730 4845 generic.go:334] "Generic (PLEG): container finished" podID="2080026c-9eee-4863-b62d-e9ce4d4525dd" containerID="a5fb957ec713b15c373dd40160d79fa6038407e541c2603db92c1fa4f6e96959" exitCode=1
Oct 06 06:47:11 crc kubenswrapper[4845]: I1006 06:47:11.845765 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zpn9l" event={"ID":"2080026c-9eee-4863-b62d-e9ce4d4525dd","Type":"ContainerDied","Data":"a5fb957ec713b15c373dd40160d79fa6038407e541c2603db92c1fa4f6e96959"}
Oct 06 06:47:11 crc kubenswrapper[4845]: I1006 06:47:11.845810 4845 scope.go:117] "RemoveContainer" containerID="8fd1ed93716049e48a00893b3366a179647ba39e3591b151c6eaa596e8629992"
Oct 06 06:47:11 crc kubenswrapper[4845]: I1006 06:47:11.846146 4845 scope.go:117] "RemoveContainer" containerID="a5fb957ec713b15c373dd40160d79fa6038407e541c2603db92c1fa4f6e96959"
Oct 06 06:47:11 crc kubenswrapper[4845]: E1006 06:47:11.846296 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-zpn9l_openshift-multus(2080026c-9eee-4863-b62d-e9ce4d4525dd)\"" pod="openshift-multus/multus-zpn9l" podUID="2080026c-9eee-4863-b62d-e9ce4d4525dd"
Oct 06 06:47:11 crc kubenswrapper[4845]: I1006 06:47:11.867912 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=16.867841143 podStartE2EDuration="16.867841143s" podCreationTimestamp="2025-10-06 06:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:46:56.277088593 +0000 UTC m=+100.791829641" watchObservedRunningTime="2025-10-06 06:47:11.867841143 +0000 UTC m=+116.382582231"
Oct 06 06:47:12 crc kubenswrapper[4845]: I1006 06:47:12.226084 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:47:12 crc kubenswrapper[4845]: I1006 06:47:12.226183 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:47:12 crc kubenswrapper[4845]: I1006 06:47:12.226095 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:47:12 crc kubenswrapper[4845]: E1006 06:47:12.226292 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:47:12 crc kubenswrapper[4845]: E1006 06:47:12.226421 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:47:12 crc kubenswrapper[4845]: E1006 06:47:12.226543 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:47:12 crc kubenswrapper[4845]: I1006 06:47:12.226652 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:47:12 crc kubenswrapper[4845]: E1006 06:47:12.226816 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:47:12 crc kubenswrapper[4845]: I1006 06:47:12.850969 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zpn9l_2080026c-9eee-4863-b62d-e9ce4d4525dd/kube-multus/1.log" Oct 06 06:47:13 crc kubenswrapper[4845]: I1006 06:47:13.226426 4845 scope.go:117] "RemoveContainer" containerID="87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce" Oct 06 06:47:13 crc kubenswrapper[4845]: I1006 06:47:13.857260 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovnkube-controller/3.log" Oct 06 06:47:13 crc kubenswrapper[4845]: I1006 06:47:13.861187 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerStarted","Data":"5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177"} Oct 06 06:47:13 crc kubenswrapper[4845]: I1006 06:47:13.861861 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:47:13 crc kubenswrapper[4845]: I1006 06:47:13.888535 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" podStartSLOduration=97.888511499 podStartE2EDuration="1m37.888511499s" podCreationTimestamp="2025-10-06 06:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:13.888443817 +0000 UTC m=+118.403184875" watchObservedRunningTime="2025-10-06 06:47:13.888511499 +0000 UTC m=+118.403252547" Oct 06 06:47:14 crc kubenswrapper[4845]: I1006 06:47:14.007475 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4l7qj"] Oct 06 
06:47:14 crc kubenswrapper[4845]: I1006 06:47:14.007581 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:47:14 crc kubenswrapper[4845]: E1006 06:47:14.007668 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:47:14 crc kubenswrapper[4845]: I1006 06:47:14.226813 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:47:14 crc kubenswrapper[4845]: I1006 06:47:14.226872 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:47:14 crc kubenswrapper[4845]: I1006 06:47:14.226968 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:47:14 crc kubenswrapper[4845]: E1006 06:47:14.226972 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:47:14 crc kubenswrapper[4845]: E1006 06:47:14.227090 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:47:14 crc kubenswrapper[4845]: E1006 06:47:14.227189 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:47:15 crc kubenswrapper[4845]: I1006 06:47:15.226783 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:47:15 crc kubenswrapper[4845]: E1006 06:47:15.226983 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:47:16 crc kubenswrapper[4845]: E1006 06:47:16.222865 4845 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 06 06:47:16 crc kubenswrapper[4845]: I1006 06:47:16.226629 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:47:16 crc kubenswrapper[4845]: I1006 06:47:16.226653 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:47:16 crc kubenswrapper[4845]: E1006 06:47:16.226858 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:47:16 crc kubenswrapper[4845]: E1006 06:47:16.227474 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:47:16 crc kubenswrapper[4845]: I1006 06:47:16.227317 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:47:16 crc kubenswrapper[4845]: E1006 06:47:16.227700 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:47:16 crc kubenswrapper[4845]: E1006 06:47:16.364658 4845 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 06:47:17 crc kubenswrapper[4845]: I1006 06:47:17.226559 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:47:17 crc kubenswrapper[4845]: E1006 06:47:17.227051 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:47:18 crc kubenswrapper[4845]: I1006 06:47:18.226279 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:47:18 crc kubenswrapper[4845]: I1006 06:47:18.226420 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:47:18 crc kubenswrapper[4845]: E1006 06:47:18.226490 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:47:18 crc kubenswrapper[4845]: I1006 06:47:18.226508 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:47:18 crc kubenswrapper[4845]: E1006 06:47:18.226690 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:47:18 crc kubenswrapper[4845]: E1006 06:47:18.226793 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:47:18 crc kubenswrapper[4845]: I1006 06:47:18.558085 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:47:19 crc kubenswrapper[4845]: I1006 06:47:19.225796 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:47:19 crc kubenswrapper[4845]: E1006 06:47:19.225909 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:47:20 crc kubenswrapper[4845]: I1006 06:47:20.226112 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:47:20 crc kubenswrapper[4845]: I1006 06:47:20.226167 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:47:20 crc kubenswrapper[4845]: I1006 06:47:20.226134 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:47:20 crc kubenswrapper[4845]: E1006 06:47:20.226570 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:47:20 crc kubenswrapper[4845]: E1006 06:47:20.226689 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:47:20 crc kubenswrapper[4845]: E1006 06:47:20.226772 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:47:21 crc kubenswrapper[4845]: I1006 06:47:21.226233 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:47:21 crc kubenswrapper[4845]: E1006 06:47:21.226438 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:47:21 crc kubenswrapper[4845]: E1006 06:47:21.366067 4845 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 06:47:22 crc kubenswrapper[4845]: I1006 06:47:22.226203 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:47:22 crc kubenswrapper[4845]: I1006 06:47:22.226268 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:47:22 crc kubenswrapper[4845]: E1006 06:47:22.226315 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:47:22 crc kubenswrapper[4845]: I1006 06:47:22.226332 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:47:22 crc kubenswrapper[4845]: E1006 06:47:22.226468 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:47:22 crc kubenswrapper[4845]: E1006 06:47:22.226557 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:47:23 crc kubenswrapper[4845]: I1006 06:47:23.226554 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:47:23 crc kubenswrapper[4845]: E1006 06:47:23.226720 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:47:24 crc kubenswrapper[4845]: I1006 06:47:24.226266 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:47:24 crc kubenswrapper[4845]: I1006 06:47:24.226317 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:47:24 crc kubenswrapper[4845]: E1006 06:47:24.226564 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:47:24 crc kubenswrapper[4845]: E1006 06:47:24.226662 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:47:24 crc kubenswrapper[4845]: I1006 06:47:24.226875 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:47:24 crc kubenswrapper[4845]: E1006 06:47:24.227091 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:47:25 crc kubenswrapper[4845]: I1006 06:47:25.226285 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:47:25 crc kubenswrapper[4845]: E1006 06:47:25.226521 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:47:26 crc kubenswrapper[4845]: I1006 06:47:26.226485 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:47:26 crc kubenswrapper[4845]: I1006 06:47:26.228106 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 06:47:26 crc kubenswrapper[4845]: E1006 06:47:26.228105 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 06:47:26 crc kubenswrapper[4845]: I1006 06:47:26.228152 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 06:47:26 crc kubenswrapper[4845]: E1006 06:47:26.228218 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 06:47:26 crc kubenswrapper[4845]: E1006 06:47:26.228293 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 06:47:26 crc kubenswrapper[4845]: I1006 06:47:26.228941 4845 scope.go:117] "RemoveContainer" containerID="a5fb957ec713b15c373dd40160d79fa6038407e541c2603db92c1fa4f6e96959" Oct 06 06:47:26 crc kubenswrapper[4845]: E1006 06:47:26.367916 4845 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 06:47:26 crc kubenswrapper[4845]: I1006 06:47:26.901970 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zpn9l_2080026c-9eee-4863-b62d-e9ce4d4525dd/kube-multus/1.log" Oct 06 06:47:26 crc kubenswrapper[4845]: I1006 06:47:26.902037 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zpn9l" event={"ID":"2080026c-9eee-4863-b62d-e9ce4d4525dd","Type":"ContainerStarted","Data":"cce252b8a800605895c878ef155f99e6ad73ee5ff140006ed993f2d0d046df30"} Oct 06 06:47:27 crc kubenswrapper[4845]: I1006 06:47:27.226575 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:47:27 crc kubenswrapper[4845]: E1006 06:47:27.226779 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d" Oct 06 06:47:28 crc kubenswrapper[4845]: I1006 06:47:28.225791 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:47:28 crc kubenswrapper[4845]: E1006 06:47:28.225958 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 06:47:28 crc kubenswrapper[4845]: I1006 06:47:28.226229 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:47:28 crc kubenswrapper[4845]: E1006 06:47:28.226444 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 06:47:28 crc kubenswrapper[4845]: I1006 06:47:28.226723 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:47:28 crc kubenswrapper[4845]: E1006 06:47:28.226848 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:47:29 crc kubenswrapper[4845]: I1006 06:47:29.226217 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:47:29 crc kubenswrapper[4845]: E1006 06:47:29.226452 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d"
Oct 06 06:47:30 crc kubenswrapper[4845]: I1006 06:47:30.226670 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:47:30 crc kubenswrapper[4845]: I1006 06:47:30.226689 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:47:30 crc kubenswrapper[4845]: I1006 06:47:30.226782 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:47:30 crc kubenswrapper[4845]: E1006 06:47:30.226892 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 06:47:30 crc kubenswrapper[4845]: E1006 06:47:30.227161 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 06:47:30 crc kubenswrapper[4845]: E1006 06:47:30.227249 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 06:47:31 crc kubenswrapper[4845]: I1006 06:47:31.226595 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:47:31 crc kubenswrapper[4845]: E1006 06:47:31.226787 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4l7qj" podUID="f80a2f04-a041-4acb-ace9-c0e40aed5f6d"
Oct 06 06:47:32 crc kubenswrapper[4845]: I1006 06:47:32.226315 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:47:32 crc kubenswrapper[4845]: I1006 06:47:32.226324 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:47:32 crc kubenswrapper[4845]: I1006 06:47:32.226475 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:47:32 crc kubenswrapper[4845]: I1006 06:47:32.228781 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Oct 06 06:47:32 crc kubenswrapper[4845]: I1006 06:47:32.228808 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Oct 06 06:47:32 crc kubenswrapper[4845]: I1006 06:47:32.229358 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Oct 06 06:47:32 crc kubenswrapper[4845]: I1006 06:47:32.229463 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Oct 06 06:47:33 crc kubenswrapper[4845]: I1006 06:47:33.225978 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj"
Oct 06 06:47:33 crc kubenswrapper[4845]: I1006 06:47:33.227863 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Oct 06 06:47:33 crc kubenswrapper[4845]: I1006 06:47:33.229158 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Oct 06 06:47:35 crc kubenswrapper[4845]: I1006 06:47:35.961746 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.037731 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9n6tj"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.038815 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.042093 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-v6dlz"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.042406 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-v6dlz"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.044486 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.044846 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.045343 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-v6457"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.045649 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.045829 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.045872 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.045892 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.046141 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-v6457"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.046184 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.046947 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.048060 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qpxz6"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.048163 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.048208 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.048222 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.048429 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.048458 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.048752 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qpxz6"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.049190 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-f7jz6"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.049844 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.050289 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.050621 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.053348 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.053636 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.053691 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.053829 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.053942 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.054632 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.054747 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.054952 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.055232 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.056216 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.063772 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kb4gk"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.064523 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kb4gk"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.065698 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.067341 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.067995 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.068274 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.070474 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bkqrp"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.071253 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bkqrp"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.078624 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.078702 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.078623 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.079325 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.079528 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.079616 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-47lzz"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.079782 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.079964 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.080083 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.080151 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-47lzz"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.080229 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.080250 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-w8s9q"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.080349 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.080480 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.080586 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.080655 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.081708 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.081738 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.082074 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.082091 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.084318 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.084592 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.100316 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.101657 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hmpsg"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.115591 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.116086 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-72mf4"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.116433 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.116799 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-72mf4"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.117666 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.118101 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hmpsg"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.118202 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.120348 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.125771 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bhflm"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.126673 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.127062 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.127263 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.127473 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.127601 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.127646 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.127780 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.127862 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.128079 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.127820 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.128087 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.127093 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.128201 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.128409 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.128247 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.128541 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.128603 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8rm5c"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.128623 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.128737 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.128880 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.129072 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.129216 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8rm5c"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.129548 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bhflm"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.131580 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.131718 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.133009 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.133498 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.133609 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.137335 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.139178 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cbspk"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.139939 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cbspk"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.141682 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.142683 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.143574 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.144867 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.145014 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.145125 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.145243 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.145443 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.145593 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.145723 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.148326 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.149557 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.149606 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.151676 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.152243 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f2fng"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.152804 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wxrw4"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.153034 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f2fng"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.154194 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.155474 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v46pw"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.158279 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v46pw"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.154227 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.159409 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7pmm8"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.159499 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.155541 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wxrw4"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.169248 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.154264 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.154295 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.154557 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.154611 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.169813 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.171301 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-jrr8k"]
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173072 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e6cec0-1b44-4424-b931-81b30b582922-config\") pod \"controller-manager-879f6c89f-w8s9q\" (UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173105 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25e6cec0-1b44-4424-b931-81b30b582922-serving-cert\") pod \"controller-manager-879f6c89f-w8s9q\" (UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173131 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e-images\") pod \"machine-api-operator-5694c8668f-v6457\" (UID: \"c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v6457"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173152 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-console-config\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173170 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnkp6\" (UniqueName: \"kubernetes.io/projected/79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7-kube-api-access-lnkp6\") pod \"etcd-operator-b45778765-f7jz6\" (UID: \"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173191 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hd9m\" (UniqueName: \"kubernetes.io/projected/687eb311-8e3f-424c-8adf-b6637e656585-kube-api-access-2hd9m\") pod \"console-operator-58897d9998-kb4gk\" (UID: \"687eb311-8e3f-424c-8adf-b6637e656585\") " pod="openshift-console-operator/console-operator-58897d9998-kb4gk"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173214 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173234 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25e6cec0-1b44-4424-b931-81b30b582922-client-ca\") pod \"controller-manager-879f6c89f-w8s9q\" (UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173252 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7-etcd-client\") pod \"etcd-operator-b45778765-f7jz6\" (UID: \"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173269 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-oauth-serving-cert\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173288 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p92l2\" (UniqueName: \"kubernetes.io/projected/c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e-kube-api-access-p92l2\") pod \"machine-api-operator-5694c8668f-v6457\" (UID: \"c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v6457"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173309 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-v6457\" (UID: \"c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v6457"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173326 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-console-oauth-config\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173342 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdtl6\" (UniqueName: \"kubernetes.io/projected/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-kube-api-access-xdtl6\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173362 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7fmg\" (UniqueName: \"kubernetes.io/projected/30d4a564-46dd-46fa-8ea4-d2b8166f3b89-kube-api-access-r7fmg\") pod \"openshift-apiserver-operator-796bbdcf4f-bkqrp\" (UID: \"30d4a564-46dd-46fa-8ea4-d2b8166f3b89\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bkqrp"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173397 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bc118c5d-7578-4c43-9a17-ffd83062571d-image-import-ca\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173453 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bc118c5d-7578-4c43-9a17-ffd83062571d-encryption-config\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173474 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName:
\"kubernetes.io/configmap/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-registry-certificates\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173509 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bc118c5d-7578-4c43-9a17-ffd83062571d-audit\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173585 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30d4a564-46dd-46fa-8ea4-d2b8166f3b89-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bkqrp\" (UID: \"30d4a564-46dd-46fa-8ea4-d2b8166f3b89\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bkqrp" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173608 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7-etcd-ca\") pod \"etcd-operator-b45778765-f7jz6\" (UID: \"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.173630 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30d4a564-46dd-46fa-8ea4-d2b8166f3b89-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bkqrp\" (UID: \"30d4a564-46dd-46fa-8ea4-d2b8166f3b89\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bkqrp" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 
06:47:36.173650 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nqkc\" (UniqueName: \"kubernetes.io/projected/feb76755-71e9-4b98-b6f6-f6961e84f273-kube-api-access-6nqkc\") pod \"cluster-samples-operator-665b6dd947-qpxz6\" (UID: \"feb76755-71e9-4b98-b6f6-f6961e84f273\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qpxz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.177909 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24vh9\" (UniqueName: \"kubernetes.io/projected/25e6cec0-1b44-4424-b931-81b30b582922-kube-api-access-24vh9\") pod \"controller-manager-879f6c89f-w8s9q\" (UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.177953 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7-serving-cert\") pod \"etcd-operator-b45778765-f7jz6\" (UID: \"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.177977 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bc118c5d-7578-4c43-9a17-ffd83062571d-node-pullsecrets\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.178005 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bc118c5d-7578-4c43-9a17-ffd83062571d-etcd-serving-ca\") pod 
\"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.178030 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-console-serving-cert\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.178052 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc118c5d-7578-4c43-9a17-ffd83062571d-config\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.178115 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7-etcd-service-ca\") pod \"etcd-operator-b45778765-f7jz6\" (UID: \"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.178138 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-trusted-ca-bundle\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.178164 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj8nq\" (UniqueName: 
\"kubernetes.io/projected/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-kube-api-access-cj8nq\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.178182 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bc118c5d-7578-4c43-9a17-ffd83062571d-etcd-client\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.178201 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc118c5d-7578-4c43-9a17-ffd83062571d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.178223 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e-config\") pod \"machine-api-operator-5694c8668f-v6457\" (UID: \"c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v6457" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.178248 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxdlr\" (UniqueName: \"kubernetes.io/projected/b245e98d-5e97-4ab4-b35c-044899fab150-kube-api-access-fxdlr\") pod \"route-controller-manager-6576b87f9c-wh5lq\" (UID: \"b245e98d-5e97-4ab4-b35c-044899fab150\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" Oct 06 06:47:36 crc 
kubenswrapper[4845]: I1006 06:47:36.178273 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/687eb311-8e3f-424c-8adf-b6637e656585-serving-cert\") pod \"console-operator-58897d9998-kb4gk\" (UID: \"687eb311-8e3f-424c-8adf-b6637e656585\") " pod="openshift-console-operator/console-operator-58897d9998-kb4gk" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.178295 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-service-ca\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.178316 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/feb76755-71e9-4b98-b6f6-f6961e84f273-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qpxz6\" (UID: \"feb76755-71e9-4b98-b6f6-f6961e84f273\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qpxz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.178340 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxzxh\" (UniqueName: \"kubernetes.io/projected/d5c71177-1d9b-4f85-9431-1a4c421281ce-kube-api-access-wxzxh\") pod \"downloads-7954f5f757-47lzz\" (UID: \"d5c71177-1d9b-4f85-9431-1a4c421281ce\") " pod="openshift-console/downloads-7954f5f757-47lzz" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.174421 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-89k76"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.174532 4845 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jrr8k" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.174585 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.178360 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b245e98d-5e97-4ab4-b35c-044899fab150-config\") pod \"route-controller-manager-6576b87f9c-wh5lq\" (UID: \"b245e98d-5e97-4ab4-b35c-044899fab150\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.180403 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b245e98d-5e97-4ab4-b35c-044899fab150-serving-cert\") pod \"route-controller-manager-6576b87f9c-wh5lq\" (UID: \"b245e98d-5e97-4ab4-b35c-044899fab150\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.180497 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-trusted-ca\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.180579 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7-config\") pod \"etcd-operator-b45778765-f7jz6\" (UID: \"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.180676 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b245e98d-5e97-4ab4-b35c-044899fab150-client-ca\") pod \"route-controller-manager-6576b87f9c-wh5lq\" (UID: \"b245e98d-5e97-4ab4-b35c-044899fab150\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.180759 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bc118c5d-7578-4c43-9a17-ffd83062571d-audit-dir\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.180831 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.180907 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25e6cec0-1b44-4424-b931-81b30b582922-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-w8s9q\" (UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.181101 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.181206 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc118c5d-7578-4c43-9a17-ffd83062571d-serving-cert\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.181286 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-registry-tls\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.181352 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-bound-sa-token\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.181551 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9fhj\" (UniqueName: \"kubernetes.io/projected/bc118c5d-7578-4c43-9a17-ffd83062571d-kube-api-access-g9fhj\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 
06:47:36.181619 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/687eb311-8e3f-424c-8adf-b6637e656585-config\") pod \"console-operator-58897d9998-kb4gk\" (UID: \"687eb311-8e3f-424c-8adf-b6637e656585\") " pod="openshift-console-operator/console-operator-58897d9998-kb4gk" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.181688 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/687eb311-8e3f-424c-8adf-b6637e656585-trusted-ca\") pod \"console-operator-58897d9998-kb4gk\" (UID: \"687eb311-8e3f-424c-8adf-b6637e656585\") " pod="openshift-console-operator/console-operator-58897d9998-kb4gk" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.196505 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.200495 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 06 06:47:36 crc kubenswrapper[4845]: E1006 06:47:36.200875 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:36.700855187 +0000 UTC m=+141.215596185 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.201167 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.201784 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.202026 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-89k76" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.203198 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.205684 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.208005 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.208543 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.209026 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g4k48"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.209446 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g4k48" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.209503 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.210122 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4sh9g"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.210154 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.210922 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4sh9g" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.211170 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lljhd"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.211700 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.212183 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-78skb"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.212559 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-78skb" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.213208 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.213566 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.215643 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.216146 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-777zb"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.216553 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pczzm"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.216902 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pczzm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.217106 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.217233 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-777zb" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.217925 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.218932 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.219408 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.220132 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.221128 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.221279 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.221532 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kq4kh"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.221707 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.221963 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kq4kh" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.222543 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lctgg"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.223102 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-lctgg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.224072 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.225291 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-v6dlz"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.234481 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-47lzz"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.234514 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-w8s9q"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.234524 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9n6tj"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 
06:47:36.234533 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.234542 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hmpsg"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.234551 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f2fng"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.234559 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kb4gk"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.235701 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-26wxt"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.236446 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-26wxt" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.236680 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-f7jz6"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.238733 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4sh9g"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.239268 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-89k76"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.249616 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.261734 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.261949 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qpxz6"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.266232 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bhflm"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.268644 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lljhd"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.272703 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.275504 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v46pw"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.278023 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8rm5c"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.283697 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.283816 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cbspk"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.283882 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.283942 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wxrw4"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.280619 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.285439 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.285568 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jbqg\" (UniqueName: \"kubernetes.io/projected/3461c091-3cf2-436e-a4fc-cbfe52d71d45-kube-api-access-6jbqg\") 
pod \"packageserver-d55dfcdfc-dpq5f\" (UID: \"3461c091-3cf2-436e-a4fc-cbfe52d71d45\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.285570 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.285591 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e810fd52-7506-4c72-98b3-97c6d1757550-metrics-tls\") pod \"dns-operator-744455d44c-lctgg\" (UID: \"e810fd52-7506-4c72-98b3-97c6d1757550\") " pod="openshift-dns-operator/dns-operator-744455d44c-lctgg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.285609 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/098443e0-56cc-4df4-81d4-0dbd6894934e-signing-key\") pod \"service-ca-9c57cc56f-kq4kh\" (UID: \"098443e0-56cc-4df4-81d4-0dbd6894934e\") " pod="openshift-service-ca/service-ca-9c57cc56f-kq4kh" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.285664 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e4309f67-a2c8-4347-ac7f-95b0e9e43317-audit-policies\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.285683 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eeeca672-ad97-48c4-b789-1393b16a0ec8-auth-proxy-config\") pod \"machine-approver-56656f9798-72mf4\" (UID: \"eeeca672-ad97-48c4-b789-1393b16a0ec8\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-72mf4" Oct 06 06:47:36 crc kubenswrapper[4845]: E1006 06:47:36.285715 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:36.785695912 +0000 UTC m=+141.300436920 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.285786 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2095ab7-406e-40fe-84b1-72bac447e2c9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-89k76\" (UID: \"e2095ab7-406e-40fe-84b1-72bac447e2c9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-89k76" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.285841 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bc118c5d-7578-4c43-9a17-ffd83062571d-node-pullsecrets\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.285884 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-console-serving-cert\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.285922 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/38106f1d-b5d4-4d89-b79b-2a6173fe76c2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-g4k48\" (UID: \"38106f1d-b5d4-4d89-b79b-2a6173fe76c2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g4k48" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.285932 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bc118c5d-7578-4c43-9a17-ffd83062571d-node-pullsecrets\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.285941 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a261240-d054-4d68-94d3-d0c675cfdde5-serving-cert\") pod \"authentication-operator-69f744f599-7pmm8\" (UID: \"8a261240-d054-4d68-94d3-d0c675cfdde5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.285987 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2wnd\" (UniqueName: \"kubernetes.io/projected/82e84adb-480c-4357-aa9c-a92e1913f386-kube-api-access-g2wnd\") pod \"router-default-5444994796-jrr8k\" (UID: \"82e84adb-480c-4357-aa9c-a92e1913f386\") " 
pod="openshift-ingress/router-default-5444994796-jrr8k" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.286008 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc118c5d-7578-4c43-9a17-ffd83062571d-config\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.286059 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zctf\" (UniqueName: \"kubernetes.io/projected/c465f78d-6eef-4782-b7d3-844c814fadea-kube-api-access-6zctf\") pod \"machine-config-server-26wxt\" (UID: \"c465f78d-6eef-4782-b7d3-844c814fadea\") " pod="openshift-machine-config-operator/machine-config-server-26wxt" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.286079 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d60904fb-5e6c-42aa-bb1b-dcd22661b23b-config\") pod \"service-ca-operator-777779d784-78skb\" (UID: \"d60904fb-5e6c-42aa-bb1b-dcd22661b23b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-78skb" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.286112 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.286140 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7-etcd-service-ca\") pod \"etcd-operator-b45778765-f7jz6\" (UID: \"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.286279 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a261240-d054-4d68-94d3-d0c675cfdde5-service-ca-bundle\") pod \"authentication-operator-69f744f599-7pmm8\" (UID: \"8a261240-d054-4d68-94d3-d0c675cfdde5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.286407 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.286504 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj8nq\" (UniqueName: \"kubernetes.io/projected/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-kube-api-access-cj8nq\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.286594 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bc118c5d-7578-4c43-9a17-ffd83062571d-etcd-client\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: 
I1006 06:47:36.286680 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc118c5d-7578-4c43-9a17-ffd83062571d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.286703 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc118c5d-7578-4c43-9a17-ffd83062571d-config\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.286816 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a5668eb-22b9-4eca-b0fa-6c53e83da118-audit-dir\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.286896 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.286967 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d009d0a-a266-4988-b410-9f0b99b66f2f-config-volume\") pod \"collect-profiles-29328885-29fwd\" (UID: \"5d009d0a-a266-4988-b410-9f0b99b66f2f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.287048 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/791544e2-a968-422b-9c6e-db760c2c0b7a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-sdgkg\" (UID: \"791544e2-a968-422b-9c6e-db760c2c0b7a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.287129 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/feb76755-71e9-4b98-b6f6-f6961e84f273-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qpxz6\" (UID: \"feb76755-71e9-4b98-b6f6-f6961e84f273\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qpxz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.287203 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2bc08931-ed88-44c4-be4f-3d38e3b2564c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cp7cw\" (UID: \"2bc08931-ed88-44c4-be4f-3d38e3b2564c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.287286 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x7tj\" (UniqueName: \"kubernetes.io/projected/5d009d0a-a266-4988-b410-9f0b99b66f2f-kube-api-access-8x7tj\") pod \"collect-profiles-29328885-29fwd\" (UID: \"5d009d0a-a266-4988-b410-9f0b99b66f2f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.287386 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b245e98d-5e97-4ab4-b35c-044899fab150-serving-cert\") pod \"route-controller-manager-6576b87f9c-wh5lq\" (UID: \"b245e98d-5e97-4ab4-b35c-044899fab150\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.287470 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4309f67-a2c8-4347-ac7f-95b0e9e43317-serving-cert\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.287542 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3461c091-3cf2-436e-a4fc-cbfe52d71d45-tmpfs\") pod \"packageserver-d55dfcdfc-dpq5f\" (UID: \"3461c091-3cf2-436e-a4fc-cbfe52d71d45\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.287616 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7-config\") pod \"etcd-operator-b45778765-f7jz6\" (UID: \"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.287686 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c465f78d-6eef-4782-b7d3-844c814fadea-node-bootstrap-token\") pod \"machine-config-server-26wxt\" (UID: \"c465f78d-6eef-4782-b7d3-844c814fadea\") " pod="openshift-machine-config-operator/machine-config-server-26wxt" 
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.287756 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97db6a94-9cba-47b5-bccb-3653396a9b3f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mltmv\" (UID: \"97db6a94-9cba-47b5-bccb-3653396a9b3f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.287824 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bc08931-ed88-44c4-be4f-3d38e3b2564c-metrics-tls\") pod \"ingress-operator-5b745b69d9-cp7cw\" (UID: \"2bc08931-ed88-44c4-be4f-3d38e3b2564c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.287899 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e4309f67-a2c8-4347-ac7f-95b0e9e43317-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.287973 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4f5cb42-510a-4970-b36b-9d124665b46e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pczzm\" (UID: \"a4f5cb42-510a-4970-b36b-9d124665b46e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pczzm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288052 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b245e98d-5e97-4ab4-b35c-044899fab150-client-ca\") pod \"route-controller-manager-6576b87f9c-wh5lq\" (UID: \"b245e98d-5e97-4ab4-b35c-044899fab150\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288132 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bc118c5d-7578-4c43-9a17-ffd83062571d-audit-dir\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288136 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc118c5d-7578-4c43-9a17-ffd83062571d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.287142 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7-etcd-service-ca\") pod \"etcd-operator-b45778765-f7jz6\" (UID: \"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288173 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bc118c5d-7578-4c43-9a17-ffd83062571d-audit-dir\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288228 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/28776653-0824-40e8-8b6d-bd0380b34b2e-profile-collector-cert\") pod \"catalog-operator-68c6474976-jmx5m\" (UID: \"28776653-0824-40e8-8b6d-bd0380b34b2e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288303 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4309f67-a2c8-4347-ac7f-95b0e9e43317-audit-dir\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288323 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e51318d-7bd1-442b-9283-1f6a305a5a4c-proxy-tls\") pod \"machine-config-controller-84d6567774-8rm5c\" (UID: \"5e51318d-7bd1-442b-9283-1f6a305a5a4c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8rm5c" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288346 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25e6cec0-1b44-4424-b931-81b30b582922-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-w8s9q\" (UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288363 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5e51318d-7bd1-442b-9283-1f6a305a5a4c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8rm5c\" (UID: \"5e51318d-7bd1-442b-9283-1f6a305a5a4c\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8rm5c" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288413 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-audit-policies\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288432 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5qg6\" (UniqueName: \"kubernetes.io/projected/4a5668eb-22b9-4eca-b0fa-6c53e83da118-kube-api-access-m5qg6\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288411 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7-config\") pod \"etcd-operator-b45778765-f7jz6\" (UID: \"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288462 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc118c5d-7578-4c43-9a17-ffd83062571d-serving-cert\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288480 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288502 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-bound-sa-token\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288520 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/687eb311-8e3f-424c-8adf-b6637e656585-config\") pod \"console-operator-58897d9998-kb4gk\" (UID: \"687eb311-8e3f-424c-8adf-b6637e656585\") " pod="openshift-console-operator/console-operator-58897d9998-kb4gk" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288538 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a54f269f-d3bd-4122-b8c0-9ee2b05d69bc-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qh7xd\" (UID: \"a54f269f-d3bd-4122-b8c0-9ee2b05d69bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288554 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/82e84adb-480c-4357-aa9c-a92e1913f386-default-certificate\") pod \"router-default-5444994796-jrr8k\" (UID: \"82e84adb-480c-4357-aa9c-a92e1913f386\") " pod="openshift-ingress/router-default-5444994796-jrr8k" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 
06:47:36.288570 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d60904fb-5e6c-42aa-bb1b-dcd22661b23b-serving-cert\") pod \"service-ca-operator-777779d784-78skb\" (UID: \"d60904fb-5e6c-42aa-bb1b-dcd22661b23b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-78skb" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288589 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7phnq\" (UniqueName: \"kubernetes.io/projected/cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67-kube-api-access-7phnq\") pod \"marketplace-operator-79b997595-lljhd\" (UID: \"cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67\") " pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288606 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3461c091-3cf2-436e-a4fc-cbfe52d71d45-apiservice-cert\") pod \"packageserver-d55dfcdfc-dpq5f\" (UID: \"3461c091-3cf2-436e-a4fc-cbfe52d71d45\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288624 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25e6cec0-1b44-4424-b931-81b30b582922-serving-cert\") pod \"controller-manager-879f6c89f-w8s9q\" (UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288640 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63275f52-7475-4177-ad94-9d2875bf90eb-config\") pod 
\"kube-controller-manager-operator-78b949d7b-cbspk\" (UID: \"63275f52-7475-4177-ad94-9d2875bf90eb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cbspk" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288663 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hd9m\" (UniqueName: \"kubernetes.io/projected/687eb311-8e3f-424c-8adf-b6637e656585-kube-api-access-2hd9m\") pod \"console-operator-58897d9998-kb4gk\" (UID: \"687eb311-8e3f-424c-8adf-b6637e656585\") " pod="openshift-console-operator/console-operator-58897d9998-kb4gk" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288682 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2bc08931-ed88-44c4-be4f-3d38e3b2564c-trusted-ca\") pod \"ingress-operator-5b745b69d9-cp7cw\" (UID: \"2bc08931-ed88-44c4-be4f-3d38e3b2564c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288703 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288721 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25e6cec0-1b44-4424-b931-81b30b582922-client-ca\") pod \"controller-manager-879f6c89f-w8s9q\" (UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288739 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-oauth-serving-cert\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288757 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c465f78d-6eef-4782-b7d3-844c814fadea-certs\") pod \"machine-config-server-26wxt\" (UID: \"c465f78d-6eef-4782-b7d3-844c814fadea\") " pod="openshift-machine-config-operator/machine-config-server-26wxt" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288775 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeeca672-ad97-48c4-b789-1393b16a0ec8-config\") pod \"machine-approver-56656f9798-72mf4\" (UID: \"eeeca672-ad97-48c4-b789-1393b16a0ec8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-72mf4" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288795 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63275f52-7475-4177-ad94-9d2875bf90eb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-cbspk\" (UID: \"63275f52-7475-4177-ad94-9d2875bf90eb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cbspk" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288813 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p92l2\" (UniqueName: \"kubernetes.io/projected/c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e-kube-api-access-p92l2\") pod \"machine-api-operator-5694c8668f-v6457\" (UID: \"c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-v6457" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288836 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-v6457\" (UID: \"c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v6457" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288854 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7fmg\" (UniqueName: \"kubernetes.io/projected/30d4a564-46dd-46fa-8ea4-d2b8166f3b89-kube-api-access-r7fmg\") pod \"openshift-apiserver-operator-796bbdcf4f-bkqrp\" (UID: \"30d4a564-46dd-46fa-8ea4-d2b8166f3b89\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bkqrp" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288874 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhlw6\" (UniqueName: \"kubernetes.io/projected/6336ebe1-aab1-4666-bc13-d6a1d4feedae-kube-api-access-zhlw6\") pod \"multus-admission-controller-857f4d67dd-4sh9g\" (UID: \"6336ebe1-aab1-4666-bc13-d6a1d4feedae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4sh9g" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288890 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d7c4789-02fd-412a-9d10-b8b36250c32e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v46pw\" (UID: \"0d7c4789-02fd-412a-9d10-b8b36250c32e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v46pw" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288905 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a54f269f-d3bd-4122-b8c0-9ee2b05d69bc-srv-cert\") pod \"olm-operator-6b444d44fb-qh7xd\" (UID: \"a54f269f-d3bd-4122-b8c0-9ee2b05d69bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288919 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/098443e0-56cc-4df4-81d4-0dbd6894934e-signing-cabundle\") pod \"service-ca-9c57cc56f-kq4kh\" (UID: \"098443e0-56cc-4df4-81d4-0dbd6894934e\") " pod="openshift-service-ca/service-ca-9c57cc56f-kq4kh" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288935 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288957 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d7c4789-02fd-412a-9d10-b8b36250c32e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v46pw\" (UID: \"0d7c4789-02fd-412a-9d10-b8b36250c32e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v46pw" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288973 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a261240-d054-4d68-94d3-d0c675cfdde5-trusted-ca-bundle\") 
pod \"authentication-operator-69f744f599-7pmm8\" (UID: \"8a261240-d054-4d68-94d3-d0c675cfdde5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.288991 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lljhd\" (UID: \"cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67\") " pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289020 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30d4a564-46dd-46fa-8ea4-d2b8166f3b89-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bkqrp\" (UID: \"30d4a564-46dd-46fa-8ea4-d2b8166f3b89\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bkqrp" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289036 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4309f67-a2c8-4347-ac7f-95b0e9e43317-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289053 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/82e84adb-480c-4357-aa9c-a92e1913f386-stats-auth\") pod \"router-default-5444994796-jrr8k\" (UID: \"82e84adb-480c-4357-aa9c-a92e1913f386\") " pod="openshift-ingress/router-default-5444994796-jrr8k" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289072 4845 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7-etcd-ca\") pod \"etcd-operator-b45778765-f7jz6\" (UID: \"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289087 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57b1e5a3-759a-45be-bea3-f102b6603edb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-f2fng\" (UID: \"57b1e5a3-759a-45be-bea3-f102b6603edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f2fng" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289105 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30d4a564-46dd-46fa-8ea4-d2b8166f3b89-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bkqrp\" (UID: \"30d4a564-46dd-46fa-8ea4-d2b8166f3b89\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bkqrp" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289122 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nqkc\" (UniqueName: \"kubernetes.io/projected/feb76755-71e9-4b98-b6f6-f6961e84f273-kube-api-access-6nqkc\") pod \"cluster-samples-operator-665b6dd947-qpxz6\" (UID: \"feb76755-71e9-4b98-b6f6-f6961e84f273\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qpxz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289138 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63275f52-7475-4177-ad94-9d2875bf90eb-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-cbspk\" (UID: \"63275f52-7475-4177-ad94-9d2875bf90eb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cbspk" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289156 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24vh9\" (UniqueName: \"kubernetes.io/projected/25e6cec0-1b44-4424-b931-81b30b582922-kube-api-access-24vh9\") pod \"controller-manager-879f6c89f-w8s9q\" (UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289173 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpbgk\" (UniqueName: \"kubernetes.io/projected/28776653-0824-40e8-8b6d-bd0380b34b2e-kube-api-access-rpbgk\") pod \"catalog-operator-68c6474976-jmx5m\" (UID: \"28776653-0824-40e8-8b6d-bd0380b34b2e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289190 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289207 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7-serving-cert\") pod \"etcd-operator-b45778765-f7jz6\" (UID: \"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 
06:47:36.289223 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bc118c5d-7578-4c43-9a17-ffd83062571d-etcd-serving-ca\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289243 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2095ab7-406e-40fe-84b1-72bac447e2c9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-89k76\" (UID: \"e2095ab7-406e-40fe-84b1-72bac447e2c9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-89k76" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289259 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9qd8\" (UniqueName: \"kubernetes.io/projected/e810fd52-7506-4c72-98b3-97c6d1757550-kube-api-access-s9qd8\") pod \"dns-operator-744455d44c-lctgg\" (UID: \"e810fd52-7506-4c72-98b3-97c6d1757550\") " pod="openshift-dns-operator/dns-operator-744455d44c-lctgg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289274 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djv5d\" (UniqueName: \"kubernetes.io/projected/8a261240-d054-4d68-94d3-d0c675cfdde5-kube-api-access-djv5d\") pod \"authentication-operator-69f744f599-7pmm8\" (UID: \"8a261240-d054-4d68-94d3-d0c675cfdde5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289291 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97db6a94-9cba-47b5-bccb-3653396a9b3f-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-mltmv\" (UID: \"97db6a94-9cba-47b5-bccb-3653396a9b3f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289308 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e4309f67-a2c8-4347-ac7f-95b0e9e43317-etcd-client\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289325 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-trusted-ca-bundle\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289345 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e-config\") pod \"machine-api-operator-5694c8668f-v6457\" (UID: \"c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v6457" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289362 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxdlr\" (UniqueName: \"kubernetes.io/projected/b245e98d-5e97-4ab4-b35c-044899fab150-kube-api-access-fxdlr\") pod \"route-controller-manager-6576b87f9c-wh5lq\" (UID: \"b245e98d-5e97-4ab4-b35c-044899fab150\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289393 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/687eb311-8e3f-424c-8adf-b6637e656585-serving-cert\") pod \"console-operator-58897d9998-kb4gk\" (UID: \"687eb311-8e3f-424c-8adf-b6637e656585\") " pod="openshift-console-operator/console-operator-58897d9998-kb4gk" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289411 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b1e5a3-759a-45be-bea3-f102b6603edb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-f2fng\" (UID: \"57b1e5a3-759a-45be-bea3-f102b6603edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f2fng" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289409 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25e6cec0-1b44-4424-b931-81b30b582922-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-w8s9q\" (UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289427 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-service-ca\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289445 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3461c091-3cf2-436e-a4fc-cbfe52d71d45-webhook-cert\") pod \"packageserver-d55dfcdfc-dpq5f\" (UID: \"3461c091-3cf2-436e-a4fc-cbfe52d71d45\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f" Oct 06 06:47:36 crc 
kubenswrapper[4845]: I1006 06:47:36.289465 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxzxh\" (UniqueName: \"kubernetes.io/projected/d5c71177-1d9b-4f85-9431-1a4c421281ce-kube-api-access-wxzxh\") pod \"downloads-7954f5f757-47lzz\" (UID: \"d5c71177-1d9b-4f85-9431-1a4c421281ce\") " pod="openshift-console/downloads-7954f5f757-47lzz" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289481 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b245e98d-5e97-4ab4-b35c-044899fab150-config\") pod \"route-controller-manager-6576b87f9c-wh5lq\" (UID: \"b245e98d-5e97-4ab4-b35c-044899fab150\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289496 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e4309f67-a2c8-4347-ac7f-95b0e9e43317-encryption-config\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289513 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6336ebe1-aab1-4666-bc13-d6a1d4feedae-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4sh9g\" (UID: \"6336ebe1-aab1-4666-bc13-d6a1d4feedae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4sh9g" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289532 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-trusted-ca\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: 
\"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289548 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lmzn\" (UniqueName: \"kubernetes.io/projected/0d7c4789-02fd-412a-9d10-b8b36250c32e-kube-api-access-4lmzn\") pod \"kube-storage-version-migrator-operator-b67b599dd-v46pw\" (UID: \"0d7c4789-02fd-412a-9d10-b8b36250c32e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v46pw" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289564 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2095ab7-406e-40fe-84b1-72bac447e2c9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-89k76\" (UID: \"e2095ab7-406e-40fe-84b1-72bac447e2c9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-89k76" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289580 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x95sh\" (UniqueName: \"kubernetes.io/projected/eeeca672-ad97-48c4-b789-1393b16a0ec8-kube-api-access-x95sh\") pod \"machine-approver-56656f9798-72mf4\" (UID: \"eeeca672-ad97-48c4-b789-1393b16a0ec8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-72mf4" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289597 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0736e7ec-281a-4e6b-9665-531846db4828-proxy-tls\") pod \"machine-config-operator-74547568cd-6ftmh\" (UID: \"0736e7ec-281a-4e6b-9665-531846db4828\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh" Oct 06 
06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289614 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289638 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/eeeca672-ad97-48c4-b789-1393b16a0ec8-machine-approver-tls\") pod \"machine-approver-56656f9798-72mf4\" (UID: \"eeeca672-ad97-48c4-b789-1393b16a0ec8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-72mf4" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289653 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2x6k\" (UniqueName: \"kubernetes.io/projected/5e51318d-7bd1-442b-9283-1f6a305a5a4c-kube-api-access-r2x6k\") pod \"machine-config-controller-84d6567774-8rm5c\" (UID: \"5e51318d-7bd1-442b-9283-1f6a305a5a4c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8rm5c" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289667 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jz5c\" (UniqueName: \"kubernetes.io/projected/098443e0-56cc-4df4-81d4-0dbd6894934e-kube-api-access-6jz5c\") pod \"service-ca-9c57cc56f-kq4kh\" (UID: \"098443e0-56cc-4df4-81d4-0dbd6894934e\") " pod="openshift-service-ca/service-ca-9c57cc56f-kq4kh" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289685 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8a261240-d054-4d68-94d3-d0c675cfdde5-config\") pod \"authentication-operator-69f744f599-7pmm8\" (UID: \"8a261240-d054-4d68-94d3-d0c675cfdde5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289703 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289723 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289743 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e27d02e6-bcff-4a2e-a494-834fae1948c7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-777zb\" (UID: \"e27d02e6-bcff-4a2e-a494-834fae1948c7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-777zb" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289761 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wjwx\" (UniqueName: \"kubernetes.io/projected/e27d02e6-bcff-4a2e-a494-834fae1948c7-kube-api-access-7wjwx\") pod \"package-server-manager-789f6589d5-777zb\" (UID: 
\"e27d02e6-bcff-4a2e-a494-834fae1948c7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-777zb" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289777 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289920 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-registry-tls\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289948 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9fhj\" (UniqueName: \"kubernetes.io/projected/bc118c5d-7578-4c43-9a17-ffd83062571d-kube-api-access-g9fhj\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289964 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/687eb311-8e3f-424c-8adf-b6637e656585-trusted-ca\") pod \"console-operator-58897d9998-kb4gk\" (UID: \"687eb311-8e3f-424c-8adf-b6637e656585\") " pod="openshift-console-operator/console-operator-58897d9998-kb4gk" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289983 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5d009d0a-a266-4988-b410-9f0b99b66f2f-secret-volume\") pod \"collect-profiles-29328885-29fwd\" (UID: \"5d009d0a-a266-4988-b410-9f0b99b66f2f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.289996 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b245e98d-5e97-4ab4-b35c-044899fab150-client-ca\") pod \"route-controller-manager-6576b87f9c-wh5lq\" (UID: \"b245e98d-5e97-4ab4-b35c-044899fab150\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290000 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82e84adb-480c-4357-aa9c-a92e1913f386-metrics-certs\") pod \"router-default-5444994796-jrr8k\" (UID: \"82e84adb-480c-4357-aa9c-a92e1913f386\") " pod="openshift-ingress/router-default-5444994796-jrr8k" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290059 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e6cec0-1b44-4424-b931-81b30b582922-config\") pod \"controller-manager-879f6c89f-w8s9q\" (UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290083 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zmtb\" (UniqueName: \"kubernetes.io/projected/b323714e-8f3c-4e94-8785-0ea3d1418677-kube-api-access-6zmtb\") pod \"migrator-59844c95c7-wxrw4\" (UID: \"b323714e-8f3c-4e94-8785-0ea3d1418677\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wxrw4" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290103 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290127 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e-images\") pod \"machine-api-operator-5694c8668f-v6457\" (UID: \"c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v6457" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290144 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-console-config\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290164 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lljhd\" (UID: \"cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67\") " pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290182 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fp7c\" (UniqueName: \"kubernetes.io/projected/2bc08931-ed88-44c4-be4f-3d38e3b2564c-kube-api-access-5fp7c\") pod \"ingress-operator-5b745b69d9-cp7cw\" (UID: 
\"2bc08931-ed88-44c4-be4f-3d38e3b2564c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290200 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4f5cb42-510a-4970-b36b-9d124665b46e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pczzm\" (UID: \"a4f5cb42-510a-4970-b36b-9d124665b46e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pczzm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290219 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvmjd\" (UniqueName: \"kubernetes.io/projected/38106f1d-b5d4-4d89-b79b-2a6173fe76c2-kube-api-access-vvmjd\") pod \"control-plane-machine-set-operator-78cbb6b69f-g4k48\" (UID: \"38106f1d-b5d4-4d89-b79b-2a6173fe76c2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g4k48" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290239 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/791544e2-a968-422b-9c6e-db760c2c0b7a-serving-cert\") pod \"openshift-config-operator-7777fb866f-sdgkg\" (UID: \"791544e2-a968-422b-9c6e-db760c2c0b7a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290263 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc 
kubenswrapper[4845]: I1006 06:47:36.290284 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnkp6\" (UniqueName: \"kubernetes.io/projected/79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7-kube-api-access-lnkp6\") pod \"etcd-operator-b45778765-f7jz6\" (UID: \"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290326 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/28776653-0824-40e8-8b6d-bd0380b34b2e-srv-cert\") pod \"catalog-operator-68c6474976-jmx5m\" (UID: \"28776653-0824-40e8-8b6d-bd0380b34b2e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290352 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7-etcd-client\") pod \"etcd-operator-b45778765-f7jz6\" (UID: \"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290392 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4f5cb42-510a-4970-b36b-9d124665b46e-config\") pod \"kube-apiserver-operator-766d6c64bb-pczzm\" (UID: \"a4f5cb42-510a-4970-b36b-9d124665b46e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pczzm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290414 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-console-oauth-config\") pod \"console-f9d7485db-v6dlz\" (UID: 
\"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290432 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdtl6\" (UniqueName: \"kubernetes.io/projected/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-kube-api-access-xdtl6\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290457 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0736e7ec-281a-4e6b-9665-531846db4828-auth-proxy-config\") pod \"machine-config-operator-74547568cd-6ftmh\" (UID: \"0736e7ec-281a-4e6b-9665-531846db4828\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290477 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bc118c5d-7578-4c43-9a17-ffd83062571d-image-import-ca\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290497 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bc118c5d-7578-4c43-9a17-ffd83062571d-encryption-config\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290516 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-registry-certificates\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290534 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0736e7ec-281a-4e6b-9665-531846db4828-images\") pod \"machine-config-operator-74547568cd-6ftmh\" (UID: \"0736e7ec-281a-4e6b-9665-531846db4828\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290551 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dlw9\" (UniqueName: \"kubernetes.io/projected/57b1e5a3-759a-45be-bea3-f102b6603edb-kube-api-access-2dlw9\") pod \"openshift-controller-manager-operator-756b6f6bc6-f2fng\" (UID: \"57b1e5a3-759a-45be-bea3-f102b6603edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f2fng" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290581 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bc118c5d-7578-4c43-9a17-ffd83062571d-audit\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290597 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56wx7\" (UniqueName: \"kubernetes.io/projected/0736e7ec-281a-4e6b-9665-531846db4828-kube-api-access-56wx7\") pod \"machine-config-operator-74547568cd-6ftmh\" (UID: \"0736e7ec-281a-4e6b-9665-531846db4828\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290613 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/97db6a94-9cba-47b5-bccb-3653396a9b3f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mltmv\" (UID: \"97db6a94-9cba-47b5-bccb-3653396a9b3f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290629 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28gzc\" (UniqueName: \"kubernetes.io/projected/97db6a94-9cba-47b5-bccb-3653396a9b3f-kube-api-access-28gzc\") pod \"cluster-image-registry-operator-dc59b4c8b-mltmv\" (UID: \"97db6a94-9cba-47b5-bccb-3653396a9b3f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290660 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82e84adb-480c-4357-aa9c-a92e1913f386-service-ca-bundle\") pod \"router-default-5444994796-jrr8k\" (UID: \"82e84adb-480c-4357-aa9c-a92e1913f386\") " pod="openshift-ingress/router-default-5444994796-jrr8k" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290680 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcggm\" (UniqueName: \"kubernetes.io/projected/791544e2-a968-422b-9c6e-db760c2c0b7a-kube-api-access-jcggm\") pod \"openshift-config-operator-7777fb866f-sdgkg\" (UID: \"791544e2-a968-422b-9c6e-db760c2c0b7a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290696 
4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290716 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhqcx\" (UniqueName: \"kubernetes.io/projected/a54f269f-d3bd-4122-b8c0-9ee2b05d69bc-kube-api-access-bhqcx\") pod \"olm-operator-6b444d44fb-qh7xd\" (UID: \"a54f269f-d3bd-4122-b8c0-9ee2b05d69bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290736 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp4vr\" (UniqueName: \"kubernetes.io/projected/e4309f67-a2c8-4347-ac7f-95b0e9e43317-kube-api-access-fp4vr\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290751 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh44f\" (UniqueName: \"kubernetes.io/projected/d60904fb-5e6c-42aa-bb1b-dcd22661b23b-kube-api-access-mh44f\") pod \"service-ca-operator-777779d784-78skb\" (UID: \"d60904fb-5e6c-42aa-bb1b-dcd22661b23b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-78skb" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.290802 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/687eb311-8e3f-424c-8adf-b6637e656585-config\") pod 
\"console-operator-58897d9998-kb4gk\" (UID: \"687eb311-8e3f-424c-8adf-b6637e656585\") " pod="openshift-console-operator/console-operator-58897d9998-kb4gk" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.291698 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e6cec0-1b44-4424-b931-81b30b582922-config\") pod \"controller-manager-879f6c89f-w8s9q\" (UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.292296 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bc118c5d-7578-4c43-9a17-ffd83062571d-image-import-ca\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.292516 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e-images\") pod \"machine-api-operator-5694c8668f-v6457\" (UID: \"c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v6457" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.293522 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc118c5d-7578-4c43-9a17-ffd83062571d-serving-cert\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.294003 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25e6cec0-1b44-4424-b931-81b30b582922-client-ca\") pod \"controller-manager-879f6c89f-w8s9q\" 
(UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.294234 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b245e98d-5e97-4ab4-b35c-044899fab150-serving-cert\") pod \"route-controller-manager-6576b87f9c-wh5lq\" (UID: \"b245e98d-5e97-4ab4-b35c-044899fab150\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.294360 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25e6cec0-1b44-4424-b931-81b30b582922-serving-cert\") pod \"controller-manager-879f6c89f-w8s9q\" (UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.294587 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/feb76755-71e9-4b98-b6f6-f6961e84f273-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qpxz6\" (UID: \"feb76755-71e9-4b98-b6f6-f6961e84f273\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qpxz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.294729 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-registry-certificates\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.295049 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-oauth-serving-cert\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.295190 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bc118c5d-7578-4c43-9a17-ffd83062571d-audit\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.295179 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.295408 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-console-config\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.295915 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7-etcd-ca\") pod \"etcd-operator-b45778765-f7jz6\" (UID: \"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.296458 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30d4a564-46dd-46fa-8ea4-d2b8166f3b89-config\") 
pod \"openshift-apiserver-operator-796bbdcf4f-bkqrp\" (UID: \"30d4a564-46dd-46fa-8ea4-d2b8166f3b89\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bkqrp" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.297431 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e-config\") pod \"machine-api-operator-5694c8668f-v6457\" (UID: \"c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v6457" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.297596 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b245e98d-5e97-4ab4-b35c-044899fab150-config\") pod \"route-controller-manager-6576b87f9c-wh5lq\" (UID: \"b245e98d-5e97-4ab4-b35c-044899fab150\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.297810 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-v6457\" (UID: \"c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v6457" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.298315 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-trusted-ca-bundle\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.298808 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-v6457"] Oct 
06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.298847 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g4k48"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.299163 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.299629 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-trusted-ca\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: E1006 06:47:36.299845 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:36.799831189 +0000 UTC m=+141.314572197 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.299995 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.300389 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7-serving-cert\") pod \"etcd-operator-b45778765-f7jz6\" (UID: \"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.301263 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bc118c5d-7578-4c43-9a17-ffd83062571d-etcd-serving-ca\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.301364 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-service-ca\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.301501 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/687eb311-8e3f-424c-8adf-b6637e656585-trusted-ca\") pod \"console-operator-58897d9998-kb4gk\" (UID: \"687eb311-8e3f-424c-8adf-b6637e656585\") " pod="openshift-console-operator/console-operator-58897d9998-kb4gk" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.302206 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bkqrp"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.302254 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.302274 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bc118c5d-7578-4c43-9a17-ffd83062571d-etcd-client\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.303495 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-78skb"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.303939 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-console-serving-cert\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.304315 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-lx59c"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.305314 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-g5lp9"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.305342 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-console-oauth-config\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.305568 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-lx59c" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.305667 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30d4a564-46dd-46fa-8ea4-d2b8166f3b89-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bkqrp\" (UID: \"30d4a564-46dd-46fa-8ea4-d2b8166f3b89\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bkqrp" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.305907 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-g5lp9" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.306363 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7pmm8"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.307592 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-registry-tls\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.307643 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.308587 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pczzm"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.308625 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bc118c5d-7578-4c43-9a17-ffd83062571d-encryption-config\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.309151 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7-etcd-client\") pod \"etcd-operator-b45778765-f7jz6\" (UID: \"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.309757 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.310823 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-g5lp9"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.311982 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-lx59c"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.312068 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/687eb311-8e3f-424c-8adf-b6637e656585-serving-cert\") pod \"console-operator-58897d9998-kb4gk\" (UID: \"687eb311-8e3f-424c-8adf-b6637e656585\") " pod="openshift-console-operator/console-operator-58897d9998-kb4gk" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.313033 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-777zb"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.314230 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.314969 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kq4kh"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.315921 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lctgg"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.317900 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-kf7gv"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.318777 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-kf7gv" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.319027 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kf7gv"] Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.323768 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.341220 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.360759 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.381156 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391155 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:36 crc kubenswrapper[4845]: E1006 06:47:36.391280 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:36.891262011 +0000 UTC m=+141.406003019 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391313 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/28776653-0824-40e8-8b6d-bd0380b34b2e-srv-cert\") pod \"catalog-operator-68c6474976-jmx5m\" (UID: \"28776653-0824-40e8-8b6d-bd0380b34b2e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391337 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4f5cb42-510a-4970-b36b-9d124665b46e-config\") pod \"kube-apiserver-operator-766d6c64bb-pczzm\" (UID: \"a4f5cb42-510a-4970-b36b-9d124665b46e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pczzm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391361 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0736e7ec-281a-4e6b-9665-531846db4828-auth-proxy-config\") pod \"machine-config-operator-74547568cd-6ftmh\" (UID: \"0736e7ec-281a-4e6b-9665-531846db4828\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391396 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/aea19779-3ce8-4dad-8c79-fa74016ec424-socket-dir\") pod 
\"csi-hostpathplugin-lx59c\" (UID: \"aea19779-3ce8-4dad-8c79-fa74016ec424\") " pod="hostpath-provisioner/csi-hostpathplugin-lx59c" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391420 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0736e7ec-281a-4e6b-9665-531846db4828-images\") pod \"machine-config-operator-74547568cd-6ftmh\" (UID: \"0736e7ec-281a-4e6b-9665-531846db4828\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391437 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dlw9\" (UniqueName: \"kubernetes.io/projected/57b1e5a3-759a-45be-bea3-f102b6603edb-kube-api-access-2dlw9\") pod \"openshift-controller-manager-operator-756b6f6bc6-f2fng\" (UID: \"57b1e5a3-759a-45be-bea3-f102b6603edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f2fng" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391453 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/97db6a94-9cba-47b5-bccb-3653396a9b3f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mltmv\" (UID: \"97db6a94-9cba-47b5-bccb-3653396a9b3f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391471 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28gzc\" (UniqueName: \"kubernetes.io/projected/97db6a94-9cba-47b5-bccb-3653396a9b3f-kube-api-access-28gzc\") pod \"cluster-image-registry-operator-dc59b4c8b-mltmv\" (UID: \"97db6a94-9cba-47b5-bccb-3653396a9b3f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv" Oct 06 06:47:36 crc kubenswrapper[4845]: 
I1006 06:47:36.391486 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82e84adb-480c-4357-aa9c-a92e1913f386-service-ca-bundle\") pod \"router-default-5444994796-jrr8k\" (UID: \"82e84adb-480c-4357-aa9c-a92e1913f386\") " pod="openshift-ingress/router-default-5444994796-jrr8k" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391509 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56wx7\" (UniqueName: \"kubernetes.io/projected/0736e7ec-281a-4e6b-9665-531846db4828-kube-api-access-56wx7\") pod \"machine-config-operator-74547568cd-6ftmh\" (UID: \"0736e7ec-281a-4e6b-9665-531846db4828\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391525 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcggm\" (UniqueName: \"kubernetes.io/projected/791544e2-a968-422b-9c6e-db760c2c0b7a-kube-api-access-jcggm\") pod \"openshift-config-operator-7777fb866f-sdgkg\" (UID: \"791544e2-a968-422b-9c6e-db760c2c0b7a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391539 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391555 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhqcx\" (UniqueName: \"kubernetes.io/projected/a54f269f-d3bd-4122-b8c0-9ee2b05d69bc-kube-api-access-bhqcx\") pod 
\"olm-operator-6b444d44fb-qh7xd\" (UID: \"a54f269f-d3bd-4122-b8c0-9ee2b05d69bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391574 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp4vr\" (UniqueName: \"kubernetes.io/projected/e4309f67-a2c8-4347-ac7f-95b0e9e43317-kube-api-access-fp4vr\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391589 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh44f\" (UniqueName: \"kubernetes.io/projected/d60904fb-5e6c-42aa-bb1b-dcd22661b23b-kube-api-access-mh44f\") pod \"service-ca-operator-777779d784-78skb\" (UID: \"d60904fb-5e6c-42aa-bb1b-dcd22661b23b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-78skb" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391606 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jbqg\" (UniqueName: \"kubernetes.io/projected/3461c091-3cf2-436e-a4fc-cbfe52d71d45-kube-api-access-6jbqg\") pod \"packageserver-d55dfcdfc-dpq5f\" (UID: \"3461c091-3cf2-436e-a4fc-cbfe52d71d45\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391620 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e810fd52-7506-4c72-98b3-97c6d1757550-metrics-tls\") pod \"dns-operator-744455d44c-lctgg\" (UID: \"e810fd52-7506-4c72-98b3-97c6d1757550\") " pod="openshift-dns-operator/dns-operator-744455d44c-lctgg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391634 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" 
(UniqueName: \"kubernetes.io/secret/098443e0-56cc-4df4-81d4-0dbd6894934e-signing-key\") pod \"service-ca-9c57cc56f-kq4kh\" (UID: \"098443e0-56cc-4df4-81d4-0dbd6894934e\") " pod="openshift-service-ca/service-ca-9c57cc56f-kq4kh" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391662 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e4309f67-a2c8-4347-ac7f-95b0e9e43317-audit-policies\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391679 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eeeca672-ad97-48c4-b789-1393b16a0ec8-auth-proxy-config\") pod \"machine-approver-56656f9798-72mf4\" (UID: \"eeeca672-ad97-48c4-b789-1393b16a0ec8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-72mf4" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391695 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2095ab7-406e-40fe-84b1-72bac447e2c9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-89k76\" (UID: \"e2095ab7-406e-40fe-84b1-72bac447e2c9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-89k76" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391712 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/38106f1d-b5d4-4d89-b79b-2a6173fe76c2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-g4k48\" (UID: \"38106f1d-b5d4-4d89-b79b-2a6173fe76c2\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g4k48" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391728 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a261240-d054-4d68-94d3-d0c675cfdde5-serving-cert\") pod \"authentication-operator-69f744f599-7pmm8\" (UID: \"8a261240-d054-4d68-94d3-d0c675cfdde5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391746 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2wnd\" (UniqueName: \"kubernetes.io/projected/82e84adb-480c-4357-aa9c-a92e1913f386-kube-api-access-g2wnd\") pod \"router-default-5444994796-jrr8k\" (UID: \"82e84adb-480c-4357-aa9c-a92e1913f386\") " pod="openshift-ingress/router-default-5444994796-jrr8k" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391762 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/aea19779-3ce8-4dad-8c79-fa74016ec424-mountpoint-dir\") pod \"csi-hostpathplugin-lx59c\" (UID: \"aea19779-3ce8-4dad-8c79-fa74016ec424\") " pod="hostpath-provisioner/csi-hostpathplugin-lx59c" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391778 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zctf\" (UniqueName: \"kubernetes.io/projected/c465f78d-6eef-4782-b7d3-844c814fadea-kube-api-access-6zctf\") pod \"machine-config-server-26wxt\" (UID: \"c465f78d-6eef-4782-b7d3-844c814fadea\") " pod="openshift-machine-config-operator/machine-config-server-26wxt" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391793 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d60904fb-5e6c-42aa-bb1b-dcd22661b23b-config\") pod \"service-ca-operator-777779d784-78skb\" (UID: \"d60904fb-5e6c-42aa-bb1b-dcd22661b23b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-78skb" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391808 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391866 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a261240-d054-4d68-94d3-d0c675cfdde5-service-ca-bundle\") pod \"authentication-operator-69f744f599-7pmm8\" (UID: \"8a261240-d054-4d68-94d3-d0c675cfdde5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391882 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391908 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a5668eb-22b9-4eca-b0fa-6c53e83da118-audit-dir\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc 
kubenswrapper[4845]: I1006 06:47:36.391937 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391959 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/791544e2-a968-422b-9c6e-db760c2c0b7a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-sdgkg\" (UID: \"791544e2-a968-422b-9c6e-db760c2c0b7a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391962 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0736e7ec-281a-4e6b-9665-531846db4828-auth-proxy-config\") pod \"machine-config-operator-74547568cd-6ftmh\" (UID: \"0736e7ec-281a-4e6b-9665-531846db4828\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.391977 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d009d0a-a266-4988-b410-9f0b99b66f2f-config-volume\") pod \"collect-profiles-29328885-29fwd\" (UID: \"5d009d0a-a266-4988-b410-9f0b99b66f2f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392005 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/2bc08931-ed88-44c4-be4f-3d38e3b2564c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cp7cw\" (UID: \"2bc08931-ed88-44c4-be4f-3d38e3b2564c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392019 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0736e7ec-281a-4e6b-9665-531846db4828-images\") pod \"machine-config-operator-74547568cd-6ftmh\" (UID: \"0736e7ec-281a-4e6b-9665-531846db4828\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392025 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x7tj\" (UniqueName: \"kubernetes.io/projected/5d009d0a-a266-4988-b410-9f0b99b66f2f-kube-api-access-8x7tj\") pod \"collect-profiles-29328885-29fwd\" (UID: \"5d009d0a-a266-4988-b410-9f0b99b66f2f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392093 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/aea19779-3ce8-4dad-8c79-fa74016ec424-registration-dir\") pod \"csi-hostpathplugin-lx59c\" (UID: \"aea19779-3ce8-4dad-8c79-fa74016ec424\") " pod="hostpath-provisioner/csi-hostpathplugin-lx59c" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392113 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4309f67-a2c8-4347-ac7f-95b0e9e43317-serving-cert\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392130 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3461c091-3cf2-436e-a4fc-cbfe52d71d45-tmpfs\") pod \"packageserver-d55dfcdfc-dpq5f\" (UID: \"3461c091-3cf2-436e-a4fc-cbfe52d71d45\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392149 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c465f78d-6eef-4782-b7d3-844c814fadea-node-bootstrap-token\") pod \"machine-config-server-26wxt\" (UID: \"c465f78d-6eef-4782-b7d3-844c814fadea\") " pod="openshift-machine-config-operator/machine-config-server-26wxt" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392177 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97db6a94-9cba-47b5-bccb-3653396a9b3f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mltmv\" (UID: \"97db6a94-9cba-47b5-bccb-3653396a9b3f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392193 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bc08931-ed88-44c4-be4f-3d38e3b2564c-metrics-tls\") pod \"ingress-operator-5b745b69d9-cp7cw\" (UID: \"2bc08931-ed88-44c4-be4f-3d38e3b2564c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392210 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e4309f67-a2c8-4347-ac7f-95b0e9e43317-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc 
kubenswrapper[4845]: I1006 06:47:36.392226 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4f5cb42-510a-4970-b36b-9d124665b46e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pczzm\" (UID: \"a4f5cb42-510a-4970-b36b-9d124665b46e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pczzm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392244 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/28776653-0824-40e8-8b6d-bd0380b34b2e-profile-collector-cert\") pod \"catalog-operator-68c6474976-jmx5m\" (UID: \"28776653-0824-40e8-8b6d-bd0380b34b2e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392260 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4309f67-a2c8-4347-ac7f-95b0e9e43317-audit-dir\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392275 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e51318d-7bd1-442b-9283-1f6a305a5a4c-proxy-tls\") pod \"machine-config-controller-84d6567774-8rm5c\" (UID: \"5e51318d-7bd1-442b-9283-1f6a305a5a4c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8rm5c" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392293 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5e51318d-7bd1-442b-9283-1f6a305a5a4c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8rm5c\" (UID: 
\"5e51318d-7bd1-442b-9283-1f6a305a5a4c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8rm5c" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392308 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-audit-policies\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392325 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5qg6\" (UniqueName: \"kubernetes.io/projected/4a5668eb-22b9-4eca-b0fa-6c53e83da118-kube-api-access-m5qg6\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392343 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392359 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a54f269f-d3bd-4122-b8c0-9ee2b05d69bc-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qh7xd\" (UID: \"a54f269f-d3bd-4122-b8c0-9ee2b05d69bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392393 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"default-certificate\" (UniqueName: \"kubernetes.io/secret/82e84adb-480c-4357-aa9c-a92e1913f386-default-certificate\") pod \"router-default-5444994796-jrr8k\" (UID: \"82e84adb-480c-4357-aa9c-a92e1913f386\") " pod="openshift-ingress/router-default-5444994796-jrr8k" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392409 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d60904fb-5e6c-42aa-bb1b-dcd22661b23b-serving-cert\") pod \"service-ca-operator-777779d784-78skb\" (UID: \"d60904fb-5e6c-42aa-bb1b-dcd22661b23b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-78skb" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392432 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f011803-cc1d-4dac-8bdd-d954067c3ab3-cert\") pod \"ingress-canary-g5lp9\" (UID: \"6f011803-cc1d-4dac-8bdd-d954067c3ab3\") " pod="openshift-ingress-canary/ingress-canary-g5lp9" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392449 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7phnq\" (UniqueName: \"kubernetes.io/projected/cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67-kube-api-access-7phnq\") pod \"marketplace-operator-79b997595-lljhd\" (UID: \"cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67\") " pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392463 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3461c091-3cf2-436e-a4fc-cbfe52d71d45-apiservice-cert\") pod \"packageserver-d55dfcdfc-dpq5f\" (UID: \"3461c091-3cf2-436e-a4fc-cbfe52d71d45\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392480 4845 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63275f52-7475-4177-ad94-9d2875bf90eb-config\") pod \"kube-controller-manager-operator-78b949d7b-cbspk\" (UID: \"63275f52-7475-4177-ad94-9d2875bf90eb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cbspk" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392494 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2bc08931-ed88-44c4-be4f-3d38e3b2564c-trusted-ca\") pod \"ingress-operator-5b745b69d9-cp7cw\" (UID: \"2bc08931-ed88-44c4-be4f-3d38e3b2564c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392510 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2kj8\" (UniqueName: \"kubernetes.io/projected/aea19779-3ce8-4dad-8c79-fa74016ec424-kube-api-access-z2kj8\") pod \"csi-hostpathplugin-lx59c\" (UID: \"aea19779-3ce8-4dad-8c79-fa74016ec424\") " pod="hostpath-provisioner/csi-hostpathplugin-lx59c" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392531 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c465f78d-6eef-4782-b7d3-844c814fadea-certs\") pod \"machine-config-server-26wxt\" (UID: \"c465f78d-6eef-4782-b7d3-844c814fadea\") " pod="openshift-machine-config-operator/machine-config-server-26wxt" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392546 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeeca672-ad97-48c4-b789-1393b16a0ec8-config\") pod \"machine-approver-56656f9798-72mf4\" (UID: \"eeeca672-ad97-48c4-b789-1393b16a0ec8\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-72mf4" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392560 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63275f52-7475-4177-ad94-9d2875bf90eb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-cbspk\" (UID: \"63275f52-7475-4177-ad94-9d2875bf90eb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cbspk" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392588 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhlw6\" (UniqueName: \"kubernetes.io/projected/6336ebe1-aab1-4666-bc13-d6a1d4feedae-kube-api-access-zhlw6\") pod \"multus-admission-controller-857f4d67dd-4sh9g\" (UID: \"6336ebe1-aab1-4666-bc13-d6a1d4feedae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4sh9g" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392606 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d7c4789-02fd-412a-9d10-b8b36250c32e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v46pw\" (UID: \"0d7c4789-02fd-412a-9d10-b8b36250c32e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v46pw" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392624 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a54f269f-d3bd-4122-b8c0-9ee2b05d69bc-srv-cert\") pod \"olm-operator-6b444d44fb-qh7xd\" (UID: \"a54f269f-d3bd-4122-b8c0-9ee2b05d69bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392641 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/098443e0-56cc-4df4-81d4-0dbd6894934e-signing-cabundle\") pod \"service-ca-9c57cc56f-kq4kh\" (UID: \"098443e0-56cc-4df4-81d4-0dbd6894934e\") " pod="openshift-service-ca/service-ca-9c57cc56f-kq4kh" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392656 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392673 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b60b102b-be29-4bcc-b0da-cb395cb25949-metrics-tls\") pod \"dns-default-kf7gv\" (UID: \"b60b102b-be29-4bcc-b0da-cb395cb25949\") " pod="openshift-dns/dns-default-kf7gv" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392689 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d7c4789-02fd-412a-9d10-b8b36250c32e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v46pw\" (UID: \"0d7c4789-02fd-412a-9d10-b8b36250c32e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v46pw" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392702 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4309f67-a2c8-4347-ac7f-95b0e9e43317-audit-dir\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.393223 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63275f52-7475-4177-ad94-9d2875bf90eb-config\") pod \"kube-controller-manager-operator-78b949d7b-cbspk\" (UID: \"63275f52-7475-4177-ad94-9d2875bf90eb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cbspk" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.393756 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3461c091-3cf2-436e-a4fc-cbfe52d71d45-tmpfs\") pod \"packageserver-d55dfcdfc-dpq5f\" (UID: \"3461c091-3cf2-436e-a4fc-cbfe52d71d45\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.393932 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.393947 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eeeca672-ad97-48c4-b789-1393b16a0ec8-auth-proxy-config\") pod \"machine-approver-56656f9798-72mf4\" (UID: \"eeeca672-ad97-48c4-b789-1393b16a0ec8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-72mf4" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394169 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeeca672-ad97-48c4-b789-1393b16a0ec8-config\") pod \"machine-approver-56656f9798-72mf4\" (UID: \"eeeca672-ad97-48c4-b789-1393b16a0ec8\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-72mf4" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.392704 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a261240-d054-4d68-94d3-d0c675cfdde5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7pmm8\" (UID: \"8a261240-d054-4d68-94d3-d0c675cfdde5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394237 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lljhd\" (UID: \"cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67\") " pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394260 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b60b102b-be29-4bcc-b0da-cb395cb25949-config-volume\") pod \"dns-default-kf7gv\" (UID: \"b60b102b-be29-4bcc-b0da-cb395cb25949\") " pod="openshift-dns/dns-default-kf7gv" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394299 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4309f67-a2c8-4347-ac7f-95b0e9e43317-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394317 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/82e84adb-480c-4357-aa9c-a92e1913f386-stats-auth\") pod \"router-default-5444994796-jrr8k\" (UID: \"82e84adb-480c-4357-aa9c-a92e1913f386\") " pod="openshift-ingress/router-default-5444994796-jrr8k"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394337 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57b1e5a3-759a-45be-bea3-f102b6603edb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-f2fng\" (UID: \"57b1e5a3-759a-45be-bea3-f102b6603edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f2fng"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394362 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63275f52-7475-4177-ad94-9d2875bf90eb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cbspk\" (UID: \"63275f52-7475-4177-ad94-9d2875bf90eb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cbspk"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394418 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpbgk\" (UniqueName: \"kubernetes.io/projected/28776653-0824-40e8-8b6d-bd0380b34b2e-kube-api-access-rpbgk\") pod \"catalog-operator-68c6474976-jmx5m\" (UID: \"28776653-0824-40e8-8b6d-bd0380b34b2e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394435 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394455 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/aea19779-3ce8-4dad-8c79-fa74016ec424-plugins-dir\") pod \"csi-hostpathplugin-lx59c\" (UID: \"aea19779-3ce8-4dad-8c79-fa74016ec424\") " pod="hostpath-provisioner/csi-hostpathplugin-lx59c"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394473 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmc6z\" (UniqueName: \"kubernetes.io/projected/6f011803-cc1d-4dac-8bdd-d954067c3ab3-kube-api-access-lmc6z\") pod \"ingress-canary-g5lp9\" (UID: \"6f011803-cc1d-4dac-8bdd-d954067c3ab3\") " pod="openshift-ingress-canary/ingress-canary-g5lp9"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394495 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2095ab7-406e-40fe-84b1-72bac447e2c9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-89k76\" (UID: \"e2095ab7-406e-40fe-84b1-72bac447e2c9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-89k76"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394506 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d7c4789-02fd-412a-9d10-b8b36250c32e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v46pw\" (UID: \"0d7c4789-02fd-412a-9d10-b8b36250c32e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v46pw"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394517 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9qd8\" (UniqueName: \"kubernetes.io/projected/e810fd52-7506-4c72-98b3-97c6d1757550-kube-api-access-s9qd8\") pod \"dns-operator-744455d44c-lctgg\" (UID: \"e810fd52-7506-4c72-98b3-97c6d1757550\") " pod="openshift-dns-operator/dns-operator-744455d44c-lctgg"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394538 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djv5d\" (UniqueName: \"kubernetes.io/projected/8a261240-d054-4d68-94d3-d0c675cfdde5-kube-api-access-djv5d\") pod \"authentication-operator-69f744f599-7pmm8\" (UID: \"8a261240-d054-4d68-94d3-d0c675cfdde5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394557 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97db6a94-9cba-47b5-bccb-3653396a9b3f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mltmv\" (UID: \"97db6a94-9cba-47b5-bccb-3653396a9b3f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394573 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e4309f67-a2c8-4347-ac7f-95b0e9e43317-etcd-client\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394605 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b1e5a3-759a-45be-bea3-f102b6603edb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-f2fng\" (UID: \"57b1e5a3-759a-45be-bea3-f102b6603edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f2fng"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394622 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3461c091-3cf2-436e-a4fc-cbfe52d71d45-webhook-cert\") pod \"packageserver-d55dfcdfc-dpq5f\" (UID: \"3461c091-3cf2-436e-a4fc-cbfe52d71d45\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394643 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e4309f67-a2c8-4347-ac7f-95b0e9e43317-encryption-config\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394659 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6336ebe1-aab1-4666-bc13-d6a1d4feedae-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4sh9g\" (UID: \"6336ebe1-aab1-4666-bc13-d6a1d4feedae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4sh9g"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394676 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lmzn\" (UniqueName: \"kubernetes.io/projected/0d7c4789-02fd-412a-9d10-b8b36250c32e-kube-api-access-4lmzn\") pod \"kube-storage-version-migrator-operator-b67b599dd-v46pw\" (UID: \"0d7c4789-02fd-412a-9d10-b8b36250c32e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v46pw"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394693 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2095ab7-406e-40fe-84b1-72bac447e2c9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-89k76\" (UID: \"e2095ab7-406e-40fe-84b1-72bac447e2c9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-89k76"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394711 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzsdq\" (UniqueName: \"kubernetes.io/projected/b60b102b-be29-4bcc-b0da-cb395cb25949-kube-api-access-hzsdq\") pod \"dns-default-kf7gv\" (UID: \"b60b102b-be29-4bcc-b0da-cb395cb25949\") " pod="openshift-dns/dns-default-kf7gv"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394736 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x95sh\" (UniqueName: \"kubernetes.io/projected/eeeca672-ad97-48c4-b789-1393b16a0ec8-kube-api-access-x95sh\") pod \"machine-approver-56656f9798-72mf4\" (UID: \"eeeca672-ad97-48c4-b789-1393b16a0ec8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-72mf4"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394768 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0736e7ec-281a-4e6b-9665-531846db4828-proxy-tls\") pod \"machine-config-operator-74547568cd-6ftmh\" (UID: \"0736e7ec-281a-4e6b-9665-531846db4828\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394788 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394818 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/eeeca672-ad97-48c4-b789-1393b16a0ec8-machine-approver-tls\") pod \"machine-approver-56656f9798-72mf4\" (UID: \"eeeca672-ad97-48c4-b789-1393b16a0ec8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-72mf4"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394840 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2x6k\" (UniqueName: \"kubernetes.io/projected/5e51318d-7bd1-442b-9283-1f6a305a5a4c-kube-api-access-r2x6k\") pod \"machine-config-controller-84d6567774-8rm5c\" (UID: \"5e51318d-7bd1-442b-9283-1f6a305a5a4c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8rm5c"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394859 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jz5c\" (UniqueName: \"kubernetes.io/projected/098443e0-56cc-4df4-81d4-0dbd6894934e-kube-api-access-6jz5c\") pod \"service-ca-9c57cc56f-kq4kh\" (UID: \"098443e0-56cc-4df4-81d4-0dbd6894934e\") " pod="openshift-service-ca/service-ca-9c57cc56f-kq4kh"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394875 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a261240-d054-4d68-94d3-d0c675cfdde5-config\") pod \"authentication-operator-69f744f599-7pmm8\" (UID: \"8a261240-d054-4d68-94d3-d0c675cfdde5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394897 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394917 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e27d02e6-bcff-4a2e-a494-834fae1948c7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-777zb\" (UID: \"e27d02e6-bcff-4a2e-a494-834fae1948c7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-777zb"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394934 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wjwx\" (UniqueName: \"kubernetes.io/projected/e27d02e6-bcff-4a2e-a494-834fae1948c7-kube-api-access-7wjwx\") pod \"package-server-manager-789f6589d5-777zb\" (UID: \"e27d02e6-bcff-4a2e-a494-834fae1948c7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-777zb"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394949 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394948 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-audit-policies\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.395138 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.394967 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d009d0a-a266-4988-b410-9f0b99b66f2f-secret-volume\") pod \"collect-profiles-29328885-29fwd\" (UID: \"5d009d0a-a266-4988-b410-9f0b99b66f2f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.395272 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82e84adb-480c-4357-aa9c-a92e1913f386-metrics-certs\") pod \"router-default-5444994796-jrr8k\" (UID: \"82e84adb-480c-4357-aa9c-a92e1913f386\") " pod="openshift-ingress/router-default-5444994796-jrr8k"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.395306 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/aea19779-3ce8-4dad-8c79-fa74016ec424-csi-data-dir\") pod \"csi-hostpathplugin-lx59c\" (UID: \"aea19779-3ce8-4dad-8c79-fa74016ec424\") " pod="hostpath-provisioner/csi-hostpathplugin-lx59c"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.395361 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zmtb\" (UniqueName: \"kubernetes.io/projected/b323714e-8f3c-4e94-8785-0ea3d1418677-kube-api-access-6zmtb\") pod \"migrator-59844c95c7-wxrw4\" (UID: \"b323714e-8f3c-4e94-8785-0ea3d1418677\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wxrw4"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.395407 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.395438 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fp7c\" (UniqueName: \"kubernetes.io/projected/2bc08931-ed88-44c4-be4f-3d38e3b2564c-kube-api-access-5fp7c\") pod \"ingress-operator-5b745b69d9-cp7cw\" (UID: \"2bc08931-ed88-44c4-be4f-3d38e3b2564c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.395466 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4f5cb42-510a-4970-b36b-9d124665b46e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pczzm\" (UID: \"a4f5cb42-510a-4970-b36b-9d124665b46e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pczzm"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.395492 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvmjd\" (UniqueName: \"kubernetes.io/projected/38106f1d-b5d4-4d89-b79b-2a6173fe76c2-kube-api-access-vvmjd\") pod \"control-plane-machine-set-operator-78cbb6b69f-g4k48\" (UID: \"38106f1d-b5d4-4d89-b79b-2a6173fe76c2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g4k48"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.395523 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lljhd\" (UID: \"cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67\") " pod="openshift-marketplace/marketplace-operator-79b997595-lljhd"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.395545 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.395550 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/791544e2-a968-422b-9c6e-db760c2c0b7a-serving-cert\") pod \"openshift-config-operator-7777fb866f-sdgkg\" (UID: \"791544e2-a968-422b-9c6e-db760c2c0b7a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.395617 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.396239 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/791544e2-a968-422b-9c6e-db760c2c0b7a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-sdgkg\" (UID: \"791544e2-a968-422b-9c6e-db760c2c0b7a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.396505 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/82e84adb-480c-4357-aa9c-a92e1913f386-default-certificate\") pod \"router-default-5444994796-jrr8k\" (UID: \"82e84adb-480c-4357-aa9c-a92e1913f386\") " pod="openshift-ingress/router-default-5444994796-jrr8k"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.396605 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97db6a94-9cba-47b5-bccb-3653396a9b3f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mltmv\" (UID: \"97db6a94-9cba-47b5-bccb-3653396a9b3f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.396680 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5e51318d-7bd1-442b-9283-1f6a305a5a4c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8rm5c\" (UID: \"5e51318d-7bd1-442b-9283-1f6a305a5a4c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8rm5c"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.396890 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a5668eb-22b9-4eca-b0fa-6c53e83da118-audit-dir\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm"
Oct 06 06:47:36 crc kubenswrapper[4845]: E1006 06:47:36.397015 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:36.896998966 +0000 UTC m=+141.411740204 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.397706 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.399677 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.399823 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.400061 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63275f52-7475-4177-ad94-9d2875bf90eb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-cbspk\" (UID: \"63275f52-7475-4177-ad94-9d2875bf90eb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cbspk"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.400395 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.400887 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.401131 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.401139 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.401221 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b1e5a3-759a-45be-bea3-f102b6603edb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-f2fng\" (UID: \"57b1e5a3-759a-45be-bea3-f102b6603edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f2fng"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.401497 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d7c4789-02fd-412a-9d10-b8b36250c32e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v46pw\" (UID: \"0d7c4789-02fd-412a-9d10-b8b36250c32e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v46pw"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.402719 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.403419 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57b1e5a3-759a-45be-bea3-f102b6603edb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-f2fng\" (UID: \"57b1e5a3-759a-45be-bea3-f102b6603edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f2fng"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.403592 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.405918 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0736e7ec-281a-4e6b-9665-531846db4828-proxy-tls\") pod \"machine-config-operator-74547568cd-6ftmh\" (UID: \"0736e7ec-281a-4e6b-9665-531846db4828\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.408837 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e51318d-7bd1-442b-9283-1f6a305a5a4c-proxy-tls\") pod \"machine-config-controller-84d6567774-8rm5c\" (UID: \"5e51318d-7bd1-442b-9283-1f6a305a5a4c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8rm5c"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.408981 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/eeeca672-ad97-48c4-b789-1393b16a0ec8-machine-approver-tls\") pod \"machine-approver-56656f9798-72mf4\" (UID: \"eeeca672-ad97-48c4-b789-1393b16a0ec8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-72mf4"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.409229 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82e84adb-480c-4357-aa9c-a92e1913f386-metrics-certs\") pod \"router-default-5444994796-jrr8k\" (UID: \"82e84adb-480c-4357-aa9c-a92e1913f386\") " pod="openshift-ingress/router-default-5444994796-jrr8k"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.422129 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.440699 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.443573 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82e84adb-480c-4357-aa9c-a92e1913f386-service-ca-bundle\") pod \"router-default-5444994796-jrr8k\" (UID: \"82e84adb-480c-4357-aa9c-a92e1913f386\") " pod="openshift-ingress/router-default-5444994796-jrr8k"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.461694 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.468170 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/82e84adb-480c-4357-aa9c-a92e1913f386-stats-auth\") pod \"router-default-5444994796-jrr8k\" (UID: \"82e84adb-480c-4357-aa9c-a92e1913f386\") " pod="openshift-ingress/router-default-5444994796-jrr8k"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.481913 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.496971 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.497261 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/aea19779-3ce8-4dad-8c79-fa74016ec424-mountpoint-dir\") pod \"csi-hostpathplugin-lx59c\" (UID: \"aea19779-3ce8-4dad-8c79-fa74016ec424\") " pod="hostpath-provisioner/csi-hostpathplugin-lx59c"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.497406 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/aea19779-3ce8-4dad-8c79-fa74016ec424-registration-dir\") pod \"csi-hostpathplugin-lx59c\" (UID: \"aea19779-3ce8-4dad-8c79-fa74016ec424\") " pod="hostpath-provisioner/csi-hostpathplugin-lx59c"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.497502 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f011803-cc1d-4dac-8bdd-d954067c3ab3-cert\") pod \"ingress-canary-g5lp9\" (UID: \"6f011803-cc1d-4dac-8bdd-d954067c3ab3\") " pod="openshift-ingress-canary/ingress-canary-g5lp9"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.497554 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2kj8\" (UniqueName: \"kubernetes.io/projected/aea19779-3ce8-4dad-8c79-fa74016ec424-kube-api-access-z2kj8\") pod \"csi-hostpathplugin-lx59c\" (UID: \"aea19779-3ce8-4dad-8c79-fa74016ec424\") " pod="hostpath-provisioner/csi-hostpathplugin-lx59c"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.497609 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b60b102b-be29-4bcc-b0da-cb395cb25949-metrics-tls\") pod \"dns-default-kf7gv\" (UID: \"b60b102b-be29-4bcc-b0da-cb395cb25949\") " pod="openshift-dns/dns-default-kf7gv"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.497653 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b60b102b-be29-4bcc-b0da-cb395cb25949-config-volume\") pod \"dns-default-kf7gv\" (UID: \"b60b102b-be29-4bcc-b0da-cb395cb25949\") " pod="openshift-dns/dns-default-kf7gv"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.497719 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/aea19779-3ce8-4dad-8c79-fa74016ec424-plugins-dir\") pod \"csi-hostpathplugin-lx59c\" (UID: \"aea19779-3ce8-4dad-8c79-fa74016ec424\") " pod="hostpath-provisioner/csi-hostpathplugin-lx59c"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.497741 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmc6z\" (UniqueName: \"kubernetes.io/projected/6f011803-cc1d-4dac-8bdd-d954067c3ab3-kube-api-access-lmc6z\") pod \"ingress-canary-g5lp9\" (UID: \"6f011803-cc1d-4dac-8bdd-d954067c3ab3\") " pod="openshift-ingress-canary/ingress-canary-g5lp9"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.497830 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzsdq\" (UniqueName: \"kubernetes.io/projected/b60b102b-be29-4bcc-b0da-cb395cb25949-kube-api-access-hzsdq\") pod \"dns-default-kf7gv\" (UID: \"b60b102b-be29-4bcc-b0da-cb395cb25949\") " pod="openshift-dns/dns-default-kf7gv"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.497928 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/aea19779-3ce8-4dad-8c79-fa74016ec424-csi-data-dir\") pod \"csi-hostpathplugin-lx59c\" (UID: \"aea19779-3ce8-4dad-8c79-fa74016ec424\") " pod="hostpath-provisioner/csi-hostpathplugin-lx59c"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.498009 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/aea19779-3ce8-4dad-8c79-fa74016ec424-socket-dir\") pod \"csi-hostpathplugin-lx59c\" (UID: \"aea19779-3ce8-4dad-8c79-fa74016ec424\") " pod="hostpath-provisioner/csi-hostpathplugin-lx59c"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.498492 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/aea19779-3ce8-4dad-8c79-fa74016ec424-socket-dir\") pod \"csi-hostpathplugin-lx59c\" (UID: \"aea19779-3ce8-4dad-8c79-fa74016ec424\") " pod="hostpath-provisioner/csi-hostpathplugin-lx59c"
Oct 06 06:47:36 crc kubenswrapper[4845]: E1006 06:47:36.498573 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:36.998556694 +0000 UTC m=+141.513297702 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.498676 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/aea19779-3ce8-4dad-8c79-fa74016ec424-mountpoint-dir\") pod \"csi-hostpathplugin-lx59c\" (UID: \"aea19779-3ce8-4dad-8c79-fa74016ec424\") " pod="hostpath-provisioner/csi-hostpathplugin-lx59c"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.498727 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/aea19779-3ce8-4dad-8c79-fa74016ec424-registration-dir\") pod \"csi-hostpathplugin-lx59c\" (UID: \"aea19779-3ce8-4dad-8c79-fa74016ec424\") " pod="hostpath-provisioner/csi-hostpathplugin-lx59c"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.498987 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/aea19779-3ce8-4dad-8c79-fa74016ec424-plugins-dir\") pod \"csi-hostpathplugin-lx59c\" (UID: \"aea19779-3ce8-4dad-8c79-fa74016ec424\") " pod="hostpath-provisioner/csi-hostpathplugin-lx59c"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.499190 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/aea19779-3ce8-4dad-8c79-fa74016ec424-csi-data-dir\") pod \"csi-hostpathplugin-lx59c\" (UID: \"aea19779-3ce8-4dad-8c79-fa74016ec424\") " pod="hostpath-provisioner/csi-hostpathplugin-lx59c"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.503602 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.520953 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.546515 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.553889 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a261240-d054-4d68-94d3-d0c675cfdde5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7pmm8\" (UID: \"8a261240-d054-4d68-94d3-d0c675cfdde5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.562870 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.581964 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.589072 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a261240-d054-4d68-94d3-d0c675cfdde5-serving-cert\") pod \"authentication-operator-69f744f599-7pmm8\" (UID: \"8a261240-d054-4d68-94d3-d0c675cfdde5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8"
Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.599743 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:36 crc kubenswrapper[4845]: E1006 06:47:36.600337 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.100316947 +0000 UTC m=+141.615057945 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.601552 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.608050 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a261240-d054-4d68-94d3-d0c675cfdde5-config\") pod \"authentication-operator-69f744f599-7pmm8\" (UID: \"8a261240-d054-4d68-94d3-d0c675cfdde5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.622807 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.640798 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.645467 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a261240-d054-4d68-94d3-d0c675cfdde5-service-ca-bundle\") pod \"authentication-operator-69f744f599-7pmm8\" (UID: \"8a261240-d054-4d68-94d3-d0c675cfdde5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.660922 4845 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.682106 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.700922 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.701050 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 06 06:47:36 crc kubenswrapper[4845]: E1006 06:47:36.701349 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.201329351 +0000 UTC m=+141.716070359 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.701723 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: E1006 06:47:36.702050 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.202037838 +0000 UTC m=+141.716778836 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.721027 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.725165 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bc08931-ed88-44c4-be4f-3d38e3b2564c-metrics-tls\") pod \"ingress-operator-5b745b69d9-cp7cw\" (UID: \"2bc08931-ed88-44c4-be4f-3d38e3b2564c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.741406 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.749732 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2095ab7-406e-40fe-84b1-72bac447e2c9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-89k76\" (UID: \"e2095ab7-406e-40fe-84b1-72bac447e2c9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-89k76" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.760969 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.791012 4845 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.794226 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2bc08931-ed88-44c4-be4f-3d38e3b2564c-trusted-ca\") pod \"ingress-operator-5b745b69d9-cp7cw\" (UID: \"2bc08931-ed88-44c4-be4f-3d38e3b2564c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.802511 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.802573 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:36 crc kubenswrapper[4845]: E1006 06:47:36.802813 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.302793396 +0000 UTC m=+141.817534404 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.803305 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: E1006 06:47:36.803702 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.303690629 +0000 UTC m=+141.818431637 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.808549 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2095ab7-406e-40fe-84b1-72bac447e2c9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-89k76\" (UID: \"e2095ab7-406e-40fe-84b1-72bac447e2c9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-89k76" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.821524 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.842139 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.845627 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4309f67-a2c8-4347-ac7f-95b0e9e43317-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.860642 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.881201 4845 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"etcd-client" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.894593 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e4309f67-a2c8-4347-ac7f-95b0e9e43317-etcd-client\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.901508 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.905110 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4309f67-a2c8-4347-ac7f-95b0e9e43317-serving-cert\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.905217 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:36 crc kubenswrapper[4845]: E1006 06:47:36.905348 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.405304578 +0000 UTC m=+141.920045616 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.905885 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:36 crc kubenswrapper[4845]: E1006 06:47:36.906309 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.406278972 +0000 UTC m=+141.921020010 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.921313 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.933845 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e4309f67-a2c8-4347-ac7f-95b0e9e43317-encryption-config\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.941557 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 06 06:47:36 crc kubenswrapper[4845]: I1006 06:47:36.961636 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.000791 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.004546 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e4309f67-a2c8-4347-ac7f-95b0e9e43317-audit-policies\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:37 crc 
kubenswrapper[4845]: I1006 06:47:37.011933 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.012582 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.512528709 +0000 UTC m=+142.027269767 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.022865 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.034604 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e4309f67-a2c8-4347-ac7f-95b0e9e43317-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.041989 4845 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.061457 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.080934 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.102592 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.105545 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/28776653-0824-40e8-8b6d-bd0380b34b2e-profile-collector-cert\") pod \"catalog-operator-68c6474976-jmx5m\" (UID: \"28776653-0824-40e8-8b6d-bd0380b34b2e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.107250 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a54f269f-d3bd-4122-b8c0-9ee2b05d69bc-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qh7xd\" (UID: \"a54f269f-d3bd-4122-b8c0-9ee2b05d69bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.108222 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d009d0a-a266-4988-b410-9f0b99b66f2f-secret-volume\") pod \"collect-profiles-29328885-29fwd\" (UID: \"5d009d0a-a266-4988-b410-9f0b99b66f2f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.115175 
4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.115564 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.615548924 +0000 UTC m=+142.130289932 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.122014 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.128121 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/28776653-0824-40e8-8b6d-bd0380b34b2e-srv-cert\") pod \"catalog-operator-68c6474976-jmx5m\" (UID: \"28776653-0824-40e8-8b6d-bd0380b34b2e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.142520 4845 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.146801 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/38106f1d-b5d4-4d89-b79b-2a6173fe76c2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-g4k48\" (UID: \"38106f1d-b5d4-4d89-b79b-2a6173fe76c2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g4k48" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.162248 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.181951 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.201303 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.209660 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6336ebe1-aab1-4666-bc13-d6a1d4feedae-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4sh9g\" (UID: \"6336ebe1-aab1-4666-bc13-d6a1d4feedae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4sh9g" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.216405 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.216608 4845 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.716562067 +0000 UTC m=+142.231303075 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.217298 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.217678 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.717665075 +0000 UTC m=+142.232406183 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.219336 4845 request.go:700] Waited for 1.007483113s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/configmaps?fieldSelector=metadata.name%3Dmarketplace-trusted-ca&limit=500&resourceVersion=0 Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.229007 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.237384 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lljhd\" (UID: \"cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67\") " pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.241333 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.261386 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.270182 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lljhd\" (UID: \"cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67\") " pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.281049 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.300618 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.319037 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.319326 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.819295375 +0000 UTC m=+142.334036413 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.319781 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.320690 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.82066835 +0000 UTC m=+142.335409388 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.321304 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.329210 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d60904fb-5e6c-42aa-bb1b-dcd22661b23b-config\") pod \"service-ca-operator-777779d784-78skb\" (UID: \"d60904fb-5e6c-42aa-bb1b-dcd22661b23b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-78skb" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.342624 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.361562 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.372293 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d60904fb-5e6c-42aa-bb1b-dcd22661b23b-serving-cert\") pod \"service-ca-operator-777779d784-78skb\" (UID: \"d60904fb-5e6c-42aa-bb1b-dcd22661b23b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-78skb" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.382925 4845 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.391965 4845 configmap.go:193] Couldn't get configMap openshift-kube-apiserver-operator/kube-apiserver-operator-config: failed to sync configmap cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.392032 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a4f5cb42-510a-4970-b36b-9d124665b46e-config podName:a4f5cb42-510a-4970-b36b-9d124665b46e nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.892014604 +0000 UTC m=+142.406755612 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/a4f5cb42-510a-4970-b36b-9d124665b46e-config") pod "kube-apiserver-operator-766d6c64bb-pczzm" (UID: "a4f5cb42-510a-4970-b36b-9d124665b46e") : failed to sync configmap cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.392216 4845 secret.go:188] Couldn't get secret openshift-image-registry/image-registry-operator-tls: failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.392249 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97db6a94-9cba-47b5-bccb-3653396a9b3f-image-registry-operator-tls podName:97db6a94-9cba-47b5-bccb-3653396a9b3f nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.892240799 +0000 UTC m=+142.406981807 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/97db6a94-9cba-47b5-bccb-3653396a9b3f-image-registry-operator-tls") pod "cluster-image-registry-operator-dc59b4c8b-mltmv" (UID: "97db6a94-9cba-47b5-bccb-3653396a9b3f") : failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.392279 4845 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.392300 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5d009d0a-a266-4988-b410-9f0b99b66f2f-config-volume podName:5d009d0a-a266-4988-b410-9f0b99b66f2f nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.892294691 +0000 UTC m=+142.407035699 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/5d009d0a-a266-4988-b410-9f0b99b66f2f-config-volume") pod "collect-profiles-29328885-29fwd" (UID: "5d009d0a-a266-4988-b410-9f0b99b66f2f") : failed to sync configmap cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.393144 4845 secret.go:188] Couldn't get secret openshift-kube-apiserver-operator/kube-apiserver-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.393312 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4f5cb42-510a-4970-b36b-9d124665b46e-serving-cert podName:a4f5cb42-510a-4970-b36b-9d124665b46e nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.893268015 +0000 UTC m=+142.408009063 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a4f5cb42-510a-4970-b36b-9d124665b46e-serving-cert") pod "kube-apiserver-operator-766d6c64bb-pczzm" (UID: "a4f5cb42-510a-4970-b36b-9d124665b46e") : failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.393470 4845 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.393528 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/098443e0-56cc-4df4-81d4-0dbd6894934e-signing-cabundle podName:098443e0-56cc-4df4-81d4-0dbd6894934e nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.893513101 +0000 UTC m=+142.408254149 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/098443e0-56cc-4df4-81d4-0dbd6894934e-signing-cabundle") pod "service-ca-9c57cc56f-kq4kh" (UID: "098443e0-56cc-4df4-81d4-0dbd6894934e") : failed to sync configmap cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.393560 4845 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.393603 4845 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.393605 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3461c091-3cf2-436e-a4fc-cbfe52d71d45-apiservice-cert podName:3461c091-3cf2-436e-a4fc-cbfe52d71d45 nodeName:}" failed. 
No retries permitted until 2025-10-06 06:47:37.893592974 +0000 UTC m=+142.408334012 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/3461c091-3cf2-436e-a4fc-cbfe52d71d45-apiservice-cert") pod "packageserver-d55dfcdfc-dpq5f" (UID: "3461c091-3cf2-436e-a4fc-cbfe52d71d45") : failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.393728 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a54f269f-d3bd-4122-b8c0-9ee2b05d69bc-srv-cert podName:a54f269f-d3bd-4122-b8c0-9ee2b05d69bc nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.893698706 +0000 UTC m=+142.408439714 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/a54f269f-d3bd-4122-b8c0-9ee2b05d69bc-srv-cert") pod "olm-operator-6b444d44fb-qh7xd" (UID: "a54f269f-d3bd-4122-b8c0-9ee2b05d69bc") : failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.393751 4845 secret.go:188] Couldn't get secret openshift-dns-operator/metrics-tls: failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.393754 4845 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.393795 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e810fd52-7506-4c72-98b3-97c6d1757550-metrics-tls podName:e810fd52-7506-4c72-98b3-97c6d1757550 nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.893787848 +0000 UTC m=+142.408528856 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e810fd52-7506-4c72-98b3-97c6d1757550-metrics-tls") pod "dns-operator-744455d44c-lctgg" (UID: "e810fd52-7506-4c72-98b3-97c6d1757550") : failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.393097 4845 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.393834 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/098443e0-56cc-4df4-81d4-0dbd6894934e-signing-key podName:098443e0-56cc-4df4-81d4-0dbd6894934e nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.893810709 +0000 UTC m=+142.408551967 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/098443e0-56cc-4df4-81d4-0dbd6894934e-signing-key") pod "service-ca-9c57cc56f-kq4kh" (UID: "098443e0-56cc-4df4-81d4-0dbd6894934e") : failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.393858 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c465f78d-6eef-4782-b7d3-844c814fadea-node-bootstrap-token podName:c465f78d-6eef-4782-b7d3-844c814fadea nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.89385137 +0000 UTC m=+142.408592378 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/c465f78d-6eef-4782-b7d3-844c814fadea-node-bootstrap-token") pod "machine-config-server-26wxt" (UID: "c465f78d-6eef-4782-b7d3-844c814fadea") : failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.394723 4845 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.394971 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c465f78d-6eef-4782-b7d3-844c814fadea-certs podName:c465f78d-6eef-4782-b7d3-844c814fadea nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.894935497 +0000 UTC m=+142.409676535 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/c465f78d-6eef-4782-b7d3-844c814fadea-certs") pod "machine-config-server-26wxt" (UID: "c465f78d-6eef-4782-b7d3-844c814fadea") : failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.395992 4845 secret.go:188] Couldn't get secret openshift-config-operator/config-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.396034 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/791544e2-a968-422b-9c6e-db760c2c0b7a-serving-cert podName:791544e2-a968-422b-9c6e-db760c2c0b7a nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.896024625 +0000 UTC m=+142.410765633 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/791544e2-a968-422b-9c6e-db760c2c0b7a-serving-cert") pod "openshift-config-operator-7777fb866f-sdgkg" (UID: "791544e2-a968-422b-9c6e-db760c2c0b7a") : failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.396876 4845 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.396982 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e27d02e6-bcff-4a2e-a494-834fae1948c7-package-server-manager-serving-cert podName:e27d02e6-bcff-4a2e-a494-834fae1948c7 nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.896961299 +0000 UTC m=+142.411702337 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e27d02e6-bcff-4a2e-a494-834fae1948c7-package-server-manager-serving-cert") pod "package-server-manager-789f6589d5-777zb" (UID: "e27d02e6-bcff-4a2e-a494-834fae1948c7") : failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.401200 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.403583 4845 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.403631 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3461c091-3cf2-436e-a4fc-cbfe52d71d45-webhook-cert podName:3461c091-3cf2-436e-a4fc-cbfe52d71d45 nodeName:}" failed. 
No retries permitted until 2025-10-06 06:47:37.903620727 +0000 UTC m=+142.418361725 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/3461c091-3cf2-436e-a4fc-cbfe52d71d45-webhook-cert") pod "packageserver-d55dfcdfc-dpq5f" (UID: "3461c091-3cf2-436e-a4fc-cbfe52d71d45") : failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.421137 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.421457 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.421796 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.921762056 +0000 UTC m=+142.436503094 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.423574 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.424017 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.924000962 +0000 UTC m=+142.438742000 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.442140 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.461254 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.482042 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.499132 4845 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.499175 4845 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.499260 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f011803-cc1d-4dac-8bdd-d954067c3ab3-cert podName:6f011803-cc1d-4dac-8bdd-d954067c3ab3 nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.999238865 +0000 UTC m=+142.513979883 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f011803-cc1d-4dac-8bdd-d954067c3ab3-cert") pod "ingress-canary-g5lp9" (UID: "6f011803-cc1d-4dac-8bdd-d954067c3ab3") : failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.499672 4845 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.499842 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b60b102b-be29-4bcc-b0da-cb395cb25949-metrics-tls podName:b60b102b-be29-4bcc-b0da-cb395cb25949 nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.999801849 +0000 UTC m=+142.514542897 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b60b102b-be29-4bcc-b0da-cb395cb25949-metrics-tls") pod "dns-default-kf7gv" (UID: "b60b102b-be29-4bcc-b0da-cb395cb25949") : failed to sync secret cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.499925 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b60b102b-be29-4bcc-b0da-cb395cb25949-config-volume podName:b60b102b-be29-4bcc-b0da-cb395cb25949 nodeName:}" failed. No retries permitted until 2025-10-06 06:47:37.999907782 +0000 UTC m=+142.514648830 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/b60b102b-be29-4bcc-b0da-cb395cb25949-config-volume") pod "dns-default-kf7gv" (UID: "b60b102b-be29-4bcc-b0da-cb395cb25949") : failed to sync configmap cache: timed out waiting for the condition Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.502950 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.522262 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.525870 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.527237 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:38.027211532 +0000 UTC m=+142.541952550 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.541600 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.561817 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.581045 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.602405 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.622078 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.628320 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.629281 4845 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:38.129246612 +0000 UTC m=+142.643987820 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.642131 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.663238 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.681183 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.701758 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.722147 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.729455 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.729661 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:38.22961874 +0000 UTC m=+142.744359788 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.730280 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.730671 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:38.230655956 +0000 UTC m=+142.745396964 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.741759 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.762101 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.781353 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.801438 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.820922 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.831477 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.831693 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:38.33166904 +0000 UTC m=+142.846410048 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.832277 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.832652 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:38.332638264 +0000 UTC m=+142.847379282 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.841422 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.861809 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.881238 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.901269 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.920797 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.933545 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.933730 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:38.43370701 +0000 UTC m=+142.948448018 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.933815 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/098443e0-56cc-4df4-81d4-0dbd6894934e-signing-key\") pod \"service-ca-9c57cc56f-kq4kh\" (UID: \"098443e0-56cc-4df4-81d4-0dbd6894934e\") " pod="openshift-service-ca/service-ca-9c57cc56f-kq4kh"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.933866 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e810fd52-7506-4c72-98b3-97c6d1757550-metrics-tls\") pod \"dns-operator-744455d44c-lctgg\" (UID: \"e810fd52-7506-4c72-98b3-97c6d1757550\") " pod="openshift-dns-operator/dns-operator-744455d44c-lctgg"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.933955 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d009d0a-a266-4988-b410-9f0b99b66f2f-config-volume\") pod \"collect-profiles-29328885-29fwd\" (UID: \"5d009d0a-a266-4988-b410-9f0b99b66f2f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.934013 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c465f78d-6eef-4782-b7d3-844c814fadea-node-bootstrap-token\") pod \"machine-config-server-26wxt\" (UID: \"c465f78d-6eef-4782-b7d3-844c814fadea\") " pod="openshift-machine-config-operator/machine-config-server-26wxt"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.934034 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4f5cb42-510a-4970-b36b-9d124665b46e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pczzm\" (UID: \"a4f5cb42-510a-4970-b36b-9d124665b46e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pczzm"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.934112 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3461c091-3cf2-436e-a4fc-cbfe52d71d45-apiservice-cert\") pod \"packageserver-d55dfcdfc-dpq5f\" (UID: \"3461c091-3cf2-436e-a4fc-cbfe52d71d45\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.934177 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c465f78d-6eef-4782-b7d3-844c814fadea-certs\") pod \"machine-config-server-26wxt\" (UID: \"c465f78d-6eef-4782-b7d3-844c814fadea\") " pod="openshift-machine-config-operator/machine-config-server-26wxt"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.934215 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/098443e0-56cc-4df4-81d4-0dbd6894934e-signing-cabundle\") pod \"service-ca-9c57cc56f-kq4kh\" (UID: \"098443e0-56cc-4df4-81d4-0dbd6894934e\") " pod="openshift-service-ca/service-ca-9c57cc56f-kq4kh"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.934293 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a54f269f-d3bd-4122-b8c0-9ee2b05d69bc-srv-cert\") pod \"olm-operator-6b444d44fb-qh7xd\" (UID: \"a54f269f-d3bd-4122-b8c0-9ee2b05d69bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.934987 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3461c091-3cf2-436e-a4fc-cbfe52d71d45-webhook-cert\") pod \"packageserver-d55dfcdfc-dpq5f\" (UID: \"3461c091-3cf2-436e-a4fc-cbfe52d71d45\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.935134 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.935171 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e27d02e6-bcff-4a2e-a494-834fae1948c7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-777zb\" (UID: \"e27d02e6-bcff-4a2e-a494-834fae1948c7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-777zb"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.935250 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/791544e2-a968-422b-9c6e-db760c2c0b7a-serving-cert\") pod \"openshift-config-operator-7777fb866f-sdgkg\" (UID: \"791544e2-a968-422b-9c6e-db760c2c0b7a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.935288 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4f5cb42-510a-4970-b36b-9d124665b46e-config\") pod \"kube-apiserver-operator-766d6c64bb-pczzm\" (UID: \"a4f5cb42-510a-4970-b36b-9d124665b46e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pczzm"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.935342 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/97db6a94-9cba-47b5-bccb-3653396a9b3f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mltmv\" (UID: \"97db6a94-9cba-47b5-bccb-3653396a9b3f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv"
Oct 06 06:47:37 crc kubenswrapper[4845]: E1006 06:47:37.935539 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:38.435523226 +0000 UTC m=+142.950264234 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.935817 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d009d0a-a266-4988-b410-9f0b99b66f2f-config-volume\") pod \"collect-profiles-29328885-29fwd\" (UID: \"5d009d0a-a266-4988-b410-9f0b99b66f2f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.937233 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4f5cb42-510a-4970-b36b-9d124665b46e-config\") pod \"kube-apiserver-operator-766d6c64bb-pczzm\" (UID: \"a4f5cb42-510a-4970-b36b-9d124665b46e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pczzm"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.938279 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/098443e0-56cc-4df4-81d4-0dbd6894934e-signing-cabundle\") pod \"service-ca-9c57cc56f-kq4kh\" (UID: \"098443e0-56cc-4df4-81d4-0dbd6894934e\") " pod="openshift-service-ca/service-ca-9c57cc56f-kq4kh"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.939360 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e27d02e6-bcff-4a2e-a494-834fae1948c7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-777zb\" (UID: \"e27d02e6-bcff-4a2e-a494-834fae1948c7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-777zb"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.939805 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/97db6a94-9cba-47b5-bccb-3653396a9b3f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mltmv\" (UID: \"97db6a94-9cba-47b5-bccb-3653396a9b3f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.940019 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/098443e0-56cc-4df4-81d4-0dbd6894934e-signing-key\") pod \"service-ca-9c57cc56f-kq4kh\" (UID: \"098443e0-56cc-4df4-81d4-0dbd6894934e\") " pod="openshift-service-ca/service-ca-9c57cc56f-kq4kh"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.940364 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3461c091-3cf2-436e-a4fc-cbfe52d71d45-apiservice-cert\") pod \"packageserver-d55dfcdfc-dpq5f\" (UID: \"3461c091-3cf2-436e-a4fc-cbfe52d71d45\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.940499 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a54f269f-d3bd-4122-b8c0-9ee2b05d69bc-srv-cert\") pod \"olm-operator-6b444d44fb-qh7xd\" (UID: \"a54f269f-d3bd-4122-b8c0-9ee2b05d69bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.940973 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4f5cb42-510a-4970-b36b-9d124665b46e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pczzm\" (UID: \"a4f5cb42-510a-4970-b36b-9d124665b46e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pczzm"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.943502 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c465f78d-6eef-4782-b7d3-844c814fadea-certs\") pod \"machine-config-server-26wxt\" (UID: \"c465f78d-6eef-4782-b7d3-844c814fadea\") " pod="openshift-machine-config-operator/machine-config-server-26wxt"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.945460 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3461c091-3cf2-436e-a4fc-cbfe52d71d45-webhook-cert\") pod \"packageserver-d55dfcdfc-dpq5f\" (UID: \"3461c091-3cf2-436e-a4fc-cbfe52d71d45\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.945736 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.946308 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/791544e2-a968-422b-9c6e-db760c2c0b7a-serving-cert\") pod \"openshift-config-operator-7777fb866f-sdgkg\" (UID: \"791544e2-a968-422b-9c6e-db760c2c0b7a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.947602 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e810fd52-7506-4c72-98b3-97c6d1757550-metrics-tls\") pod \"dns-operator-744455d44c-lctgg\" (UID: \"e810fd52-7506-4c72-98b3-97c6d1757550\") " pod="openshift-dns-operator/dns-operator-744455d44c-lctgg"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.950274 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c465f78d-6eef-4782-b7d3-844c814fadea-node-bootstrap-token\") pod \"machine-config-server-26wxt\" (UID: \"c465f78d-6eef-4782-b7d3-844c814fadea\") " pod="openshift-machine-config-operator/machine-config-server-26wxt"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.976743 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj8nq\" (UniqueName: \"kubernetes.io/projected/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-kube-api-access-cj8nq\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:37 crc kubenswrapper[4845]: I1006 06:47:37.995140 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-bound-sa-token\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.013477 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hd9m\" (UniqueName: \"kubernetes.io/projected/687eb311-8e3f-424c-8adf-b6637e656585-kube-api-access-2hd9m\") pod \"console-operator-58897d9998-kb4gk\" (UID: \"687eb311-8e3f-424c-8adf-b6637e656585\") " pod="openshift-console-operator/console-operator-58897d9998-kb4gk"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.033619 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdtl6\" (UniqueName: \"kubernetes.io/projected/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-kube-api-access-xdtl6\") pod \"console-f9d7485db-v6dlz\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " pod="openshift-console/console-f9d7485db-v6dlz"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.036479 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:38 crc kubenswrapper[4845]: E1006 06:47:38.036622 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:38.536591501 +0000 UTC m=+143.051332549 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.037162 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f011803-cc1d-4dac-8bdd-d954067c3ab3-cert\") pod \"ingress-canary-g5lp9\" (UID: \"6f011803-cc1d-4dac-8bdd-d954067c3ab3\") " pod="openshift-ingress-canary/ingress-canary-g5lp9"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.037426 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b60b102b-be29-4bcc-b0da-cb395cb25949-metrics-tls\") pod \"dns-default-kf7gv\" (UID: \"b60b102b-be29-4bcc-b0da-cb395cb25949\") " pod="openshift-dns/dns-default-kf7gv"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.037532 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b60b102b-be29-4bcc-b0da-cb395cb25949-config-volume\") pod \"dns-default-kf7gv\" (UID: \"b60b102b-be29-4bcc-b0da-cb395cb25949\") " pod="openshift-dns/dns-default-kf7gv"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.038007 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:38 crc kubenswrapper[4845]: E1006 06:47:38.045297 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:38.54329195 +0000 UTC m=+143.058032968 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.061625 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnkp6\" (UniqueName: \"kubernetes.io/projected/79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7-kube-api-access-lnkp6\") pod \"etcd-operator-b45778765-f7jz6\" (UID: \"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.076253 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p92l2\" (UniqueName: \"kubernetes.io/projected/c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e-kube-api-access-p92l2\") pod \"machine-api-operator-5694c8668f-v6457\" (UID: \"c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v6457"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.097041 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nqkc\" (UniqueName: \"kubernetes.io/projected/feb76755-71e9-4b98-b6f6-f6961e84f273-kube-api-access-6nqkc\") pod \"cluster-samples-operator-665b6dd947-qpxz6\" (UID: \"feb76755-71e9-4b98-b6f6-f6961e84f273\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qpxz6"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.119455 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxdlr\" (UniqueName: \"kubernetes.io/projected/b245e98d-5e97-4ab4-b35c-044899fab150-kube-api-access-fxdlr\") pod \"route-controller-manager-6576b87f9c-wh5lq\" (UID: \"b245e98d-5e97-4ab4-b35c-044899fab150\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.134703 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7fmg\" (UniqueName: \"kubernetes.io/projected/30d4a564-46dd-46fa-8ea4-d2b8166f3b89-kube-api-access-r7fmg\") pod \"openshift-apiserver-operator-796bbdcf4f-bkqrp\" (UID: \"30d4a564-46dd-46fa-8ea4-d2b8166f3b89\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bkqrp"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.138763 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:38 crc kubenswrapper[4845]: E1006 06:47:38.138914 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:38.638882327 +0000 UTC m=+143.153623375 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.139210 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:38 crc kubenswrapper[4845]: E1006 06:47:38.139688 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:38.639623396 +0000 UTC m=+143.154364444 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.156256 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24vh9\" (UniqueName: \"kubernetes.io/projected/25e6cec0-1b44-4424-b931-81b30b582922-kube-api-access-24vh9\") pod \"controller-manager-879f6c89f-w8s9q\" (UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.179135 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9fhj\" (UniqueName: \"kubernetes.io/projected/bc118c5d-7578-4c43-9a17-ffd83062571d-kube-api-access-g9fhj\") pod \"apiserver-76f77b778f-hmpsg\" (UID: \"bc118c5d-7578-4c43-9a17-ffd83062571d\") " pod="openshift-apiserver/apiserver-76f77b778f-hmpsg"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.179467 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-v6dlz"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.200701 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxzxh\" (UniqueName: \"kubernetes.io/projected/d5c71177-1d9b-4f85-9431-1a4c421281ce-kube-api-access-wxzxh\") pod \"downloads-7954f5f757-47lzz\" (UID: \"d5c71177-1d9b-4f85-9431-1a4c421281ce\") " pod="openshift-console/downloads-7954f5f757-47lzz"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.202767 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.217664 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.220867 4845 request.go:700] Waited for 1.914930537s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.227517 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-v6457"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.229099 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.239307 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qpxz6"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.240226 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:38 crc kubenswrapper[4845]: E1006 06:47:38.240631 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:38.740597379 +0000 UTC m=+143.255338437 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.240844 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:38 crc kubenswrapper[4845]: E1006 06:47:38.241319 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:38.741299907 +0000 UTC m=+143.256040955 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.242445 4845 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.248525 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.255981 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kb4gk"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.262899 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.272831 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f011803-cc1d-4dac-8bdd-d954067c3ab3-cert\") pod \"ingress-canary-g5lp9\" (UID: \"6f011803-cc1d-4dac-8bdd-d954067c3ab3\") " pod="openshift-ingress-canary/ingress-canary-g5lp9"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.285793 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.286089 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bkqrp"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.325954 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-47lzz"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.330544 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.330636 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.330789 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.355910 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:38 crc kubenswrapper[4845]: E1006 06:47:38.356738 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:38.856719274 +0000 UTC m=+143.371460282 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.365403 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hmpsg"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.367414 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.368073 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.378214 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b60b102b-be29-4bcc-b0da-cb395cb25949-metrics-tls\") pod \"dns-default-kf7gv\" (UID: \"b60b102b-be29-4bcc-b0da-cb395cb25949\") " pod="openshift-dns/dns-default-kf7gv"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.387046 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.388283 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b60b102b-be29-4bcc-b0da-cb395cb25949-config-volume\") pod \"dns-default-kf7gv\" (UID: \"b60b102b-be29-4bcc-b0da-cb395cb25949\") " pod="openshift-dns/dns-default-kf7gv"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.417835 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dlw9\" (UniqueName: \"kubernetes.io/projected/57b1e5a3-759a-45be-bea3-f102b6603edb-kube-api-access-2dlw9\") pod \"openshift-controller-manager-operator-756b6f6bc6-f2fng\" (UID: \"57b1e5a3-759a-45be-bea3-f102b6603edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f2fng"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.425922 4845 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f2fng" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.438554 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28gzc\" (UniqueName: \"kubernetes.io/projected/97db6a94-9cba-47b5-bccb-3653396a9b3f-kube-api-access-28gzc\") pod \"cluster-image-registry-operator-dc59b4c8b-mltmv\" (UID: \"97db6a94-9cba-47b5-bccb-3653396a9b3f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.458696 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:38 crc kubenswrapper[4845]: E1006 06:47:38.459283 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:38.959260157 +0000 UTC m=+143.474001165 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.461355 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-v6dlz"] Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.462160 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2bc08931-ed88-44c4-be4f-3d38e3b2564c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cp7cw\" (UID: \"2bc08931-ed88-44c4-be4f-3d38e3b2564c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.484110 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56wx7\" (UniqueName: \"kubernetes.io/projected/0736e7ec-281a-4e6b-9665-531846db4828-kube-api-access-56wx7\") pod \"machine-config-operator-74547568cd-6ftmh\" (UID: \"0736e7ec-281a-4e6b-9665-531846db4828\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.489296 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq"] Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.505850 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcggm\" (UniqueName: \"kubernetes.io/projected/791544e2-a968-422b-9c6e-db760c2c0b7a-kube-api-access-jcggm\") pod \"openshift-config-operator-7777fb866f-sdgkg\" (UID: 
\"791544e2-a968-422b-9c6e-db760c2c0b7a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.518041 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x7tj\" (UniqueName: \"kubernetes.io/projected/5d009d0a-a266-4988-b410-9f0b99b66f2f-kube-api-access-8x7tj\") pod \"collect-profiles-29328885-29fwd\" (UID: \"5d009d0a-a266-4988-b410-9f0b99b66f2f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd" Oct 06 06:47:38 crc kubenswrapper[4845]: W1006 06:47:38.535906 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55d881a0_9c07_42e7_aed8_8a883c4b1ff5.slice/crio-30aa4041ac1fa882f8ad9834497173e95378fddda515c703e54666c9def1b2f0 WatchSource:0}: Error finding container 30aa4041ac1fa882f8ad9834497173e95378fddda515c703e54666c9def1b2f0: Status 404 returned error can't find the container with id 30aa4041ac1fa882f8ad9834497173e95378fddda515c703e54666c9def1b2f0 Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.535967 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5qg6\" (UniqueName: \"kubernetes.io/projected/4a5668eb-22b9-4eca-b0fa-6c53e83da118-kube-api-access-m5qg6\") pod \"oauth-openshift-558db77b4-bhflm\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:38 crc kubenswrapper[4845]: W1006 06:47:38.537415 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb245e98d_5e97_4ab4_b35c_044899fab150.slice/crio-26ffde206572845ebb94da954fbeaf9daf7ba310aaeb63c9fb8e958fcb6db186 WatchSource:0}: Error finding container 26ffde206572845ebb94da954fbeaf9daf7ba310aaeb63c9fb8e958fcb6db186: Status 404 returned error can't find the container with id 
26ffde206572845ebb94da954fbeaf9daf7ba310aaeb63c9fb8e958fcb6db186 Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.557927 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7phnq\" (UniqueName: \"kubernetes.io/projected/cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67-kube-api-access-7phnq\") pod \"marketplace-operator-79b997595-lljhd\" (UID: \"cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67\") " pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.561574 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:38 crc kubenswrapper[4845]: E1006 06:47:38.562120 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:39.062104697 +0000 UTC m=+143.576845705 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.575588 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97db6a94-9cba-47b5-bccb-3653396a9b3f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mltmv\" (UID: \"97db6a94-9cba-47b5-bccb-3653396a9b3f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.582924 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.590132 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.595201 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhqcx\" (UniqueName: \"kubernetes.io/projected/a54f269f-d3bd-4122-b8c0-9ee2b05d69bc-kube-api-access-bhqcx\") pod \"olm-operator-6b444d44fb-qh7xd\" (UID: \"a54f269f-d3bd-4122-b8c0-9ee2b05d69bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.597357 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.614974 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp4vr\" (UniqueName: \"kubernetes.io/projected/e4309f67-a2c8-4347-ac7f-95b0e9e43317-kube-api-access-fp4vr\") pod \"apiserver-7bbb656c7d-htk6t\" (UID: \"e4309f67-a2c8-4347-ac7f-95b0e9e43317\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.643175 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh44f\" (UniqueName: \"kubernetes.io/projected/d60904fb-5e6c-42aa-bb1b-dcd22661b23b-kube-api-access-mh44f\") pod \"service-ca-operator-777779d784-78skb\" (UID: \"d60904fb-5e6c-42aa-bb1b-dcd22661b23b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-78skb" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.657598 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jbqg\" (UniqueName: \"kubernetes.io/projected/3461c091-3cf2-436e-a4fc-cbfe52d71d45-kube-api-access-6jbqg\") pod \"packageserver-d55dfcdfc-dpq5f\" (UID: \"3461c091-3cf2-436e-a4fc-cbfe52d71d45\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.664905 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-f7jz6"] Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.665925 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 
06:47:38 crc kubenswrapper[4845]: E1006 06:47:38.666420 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:39.166405564 +0000 UTC m=+143.681146572 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.674733 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.677769 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.690737 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2095ab7-406e-40fe-84b1-72bac447e2c9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-89k76\" (UID: \"e2095ab7-406e-40fe-84b1-72bac447e2c9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-89k76" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.696562 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9qd8\" (UniqueName: \"kubernetes.io/projected/e810fd52-7506-4c72-98b3-97c6d1757550-kube-api-access-s9qd8\") pod \"dns-operator-744455d44c-lctgg\" (UID: \"e810fd52-7506-4c72-98b3-97c6d1757550\") " pod="openshift-dns-operator/dns-operator-744455d44c-lctgg" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.721982 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djv5d\" (UniqueName: \"kubernetes.io/projected/8a261240-d054-4d68-94d3-d0c675cfdde5-kube-api-access-djv5d\") pod \"authentication-operator-69f744f599-7pmm8\" (UID: \"8a261240-d054-4d68-94d3-d0c675cfdde5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.742108 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhlw6\" (UniqueName: \"kubernetes.io/projected/6336ebe1-aab1-4666-bc13-d6a1d4feedae-kube-api-access-zhlw6\") pod \"multus-admission-controller-857f4d67dd-4sh9g\" (UID: \"6336ebe1-aab1-4666-bc13-d6a1d4feedae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4sh9g" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.745931 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.756902 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2wnd\" (UniqueName: \"kubernetes.io/projected/82e84adb-480c-4357-aa9c-a92e1913f386-kube-api-access-g2wnd\") pod \"router-default-5444994796-jrr8k\" (UID: \"82e84adb-480c-4357-aa9c-a92e1913f386\") " pod="openshift-ingress/router-default-5444994796-jrr8k" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.759693 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-89k76" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.769976 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.770641 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:38 crc kubenswrapper[4845]: E1006 06:47:38.771412 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:39.271361938 +0000 UTC m=+143.786102946 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.776927 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63275f52-7475-4177-ad94-9d2875bf90eb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cbspk\" (UID: \"63275f52-7475-4177-ad94-9d2875bf90eb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cbspk" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.788694 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4sh9g" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.800214 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2x6k\" (UniqueName: \"kubernetes.io/projected/5e51318d-7bd1-442b-9283-1f6a305a5a4c-kube-api-access-r2x6k\") pod \"machine-config-controller-84d6567774-8rm5c\" (UID: \"5e51318d-7bd1-442b-9283-1f6a305a5a4c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8rm5c" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.814219 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f2fng"] Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.825909 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bkqrp"] Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.829702 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lmzn\" (UniqueName: \"kubernetes.io/projected/0d7c4789-02fd-412a-9d10-b8b36250c32e-kube-api-access-4lmzn\") pod \"kube-storage-version-migrator-operator-b67b599dd-v46pw\" (UID: \"0d7c4789-02fd-412a-9d10-b8b36250c32e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v46pw" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.831222 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.837093 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-78skb" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.846176 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-v6457"] Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.846241 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qpxz6"] Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.846848 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.853190 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jz5c\" (UniqueName: \"kubernetes.io/projected/098443e0-56cc-4df4-81d4-0dbd6894934e-kube-api-access-6jz5c\") pod \"service-ca-9c57cc56f-kq4kh\" (UID: \"098443e0-56cc-4df4-81d4-0dbd6894934e\") " pod="openshift-service-ca/service-ca-9c57cc56f-kq4kh" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.858598 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.862347 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpbgk\" (UniqueName: \"kubernetes.io/projected/28776653-0824-40e8-8b6d-bd0380b34b2e-kube-api-access-rpbgk\") pod \"catalog-operator-68c6474976-jmx5m\" (UID: \"28776653-0824-40e8-8b6d-bd0380b34b2e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.866316 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd"] Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.872645 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:38 crc kubenswrapper[4845]: E1006 06:47:38.873023 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:39.373010938 +0000 UTC m=+143.887751946 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.879200 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zctf\" (UniqueName: \"kubernetes.io/projected/c465f78d-6eef-4782-b7d3-844c814fadea-kube-api-access-6zctf\") pod \"machine-config-server-26wxt\" (UID: \"c465f78d-6eef-4782-b7d3-844c814fadea\") " pod="openshift-machine-config-operator/machine-config-server-26wxt" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.889755 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv"] Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.915598 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kq4kh" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.917466 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-lctgg" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.922846 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jrr8k" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.923281 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-26wxt" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.926275 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wjwx\" (UniqueName: \"kubernetes.io/projected/e27d02e6-bcff-4a2e-a494-834fae1948c7-kube-api-access-7wjwx\") pod \"package-server-manager-789f6589d5-777zb\" (UID: \"e27d02e6-bcff-4a2e-a494-834fae1948c7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-777zb" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.951414 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x95sh\" (UniqueName: \"kubernetes.io/projected/eeeca672-ad97-48c4-b789-1393b16a0ec8-kube-api-access-x95sh\") pod \"machine-approver-56656f9798-72mf4\" (UID: \"eeeca672-ad97-48c4-b789-1393b16a0ec8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-72mf4" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.951675 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f2fng" event={"ID":"57b1e5a3-759a-45be-bea3-f102b6603edb","Type":"ContainerStarted","Data":"f27fe0998fb7b534e018542895f9447de1d13d066064e68cc33c4fbed81797b6"} Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.954518 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-v6457" event={"ID":"c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e","Type":"ContainerStarted","Data":"6f78eea230f076e816f4e45322b39eb8122f07b4d9413110cc8cf044ae7a2355"} Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.955000 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-w8s9q"] Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.963451 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/downloads-7954f5f757-47lzz"] Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.963497 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" event={"ID":"b245e98d-5e97-4ab4-b35c-044899fab150","Type":"ContainerStarted","Data":"21bf98ab32c0bea0c0e265ed75812811062db9c4306106380fa30420bc47f0ba"} Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.963524 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" event={"ID":"b245e98d-5e97-4ab4-b35c-044899fab150","Type":"ContainerStarted","Data":"26ffde206572845ebb94da954fbeaf9daf7ba310aaeb63c9fb8e958fcb6db186"} Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.964312 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.964564 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8rm5c"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.964965 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kb4gk"]
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.965316 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zmtb\" (UniqueName: \"kubernetes.io/projected/b323714e-8f3c-4e94-8785-0ea3d1418677-kube-api-access-6zmtb\") pod \"migrator-59844c95c7-wxrw4\" (UID: \"b323714e-8f3c-4e94-8785-0ea3d1418677\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wxrw4"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.965977 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hmpsg"]
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.966157 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvmjd\" (UniqueName: \"kubernetes.io/projected/38106f1d-b5d4-4d89-b79b-2a6173fe76c2-kube-api-access-vvmjd\") pod \"control-plane-machine-set-operator-78cbb6b69f-g4k48\" (UID: \"38106f1d-b5d4-4d89-b79b-2a6173fe76c2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g4k48"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.967756 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v6dlz" event={"ID":"55d881a0-9c07-42e7-aed8-8a883c4b1ff5","Type":"ContainerStarted","Data":"557da3782f0adf2ba3139a4ee5a305565d54aa8c86409ec67047181a3d8260df"}
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.967790 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v6dlz" event={"ID":"55d881a0-9c07-42e7-aed8-8a883c4b1ff5","Type":"ContainerStarted","Data":"30aa4041ac1fa882f8ad9834497173e95378fddda515c703e54666c9def1b2f0"}
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.972162 4845 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-wh5lq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.972206 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" podUID="b245e98d-5e97-4ab4-b35c-044899fab150" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.974317 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:38 crc kubenswrapper[4845]: E1006 06:47:38.974492 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:39.474469973 +0000 UTC m=+143.989210981 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.974625 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:38 crc kubenswrapper[4845]: E1006 06:47:38.975002 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:39.474993286 +0000 UTC m=+143.989734504 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.977899 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6" event={"ID":"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7","Type":"ContainerStarted","Data":"dc08c7c42ea7b80d001c3ef583b6d93bcd53c16fe9ddacef28eb13ffea26f8e4"}
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.980974 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fp7c\" (UniqueName: \"kubernetes.io/projected/2bc08931-ed88-44c4-be4f-3d38e3b2564c-kube-api-access-5fp7c\") pod \"ingress-operator-5b745b69d9-cp7cw\" (UID: \"2bc08931-ed88-44c4-be4f-3d38e3b2564c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.987962 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cbspk"
Oct 06 06:47:38 crc kubenswrapper[4845]: I1006 06:47:38.994139 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4f5cb42-510a-4970-b36b-9d124665b46e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pczzm\" (UID: \"a4f5cb42-510a-4970-b36b-9d124665b46e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pczzm"
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.027938 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2kj8\" (UniqueName: \"kubernetes.io/projected/aea19779-3ce8-4dad-8c79-fa74016ec424-kube-api-access-z2kj8\") pod \"csi-hostpathplugin-lx59c\" (UID: \"aea19779-3ce8-4dad-8c79-fa74016ec424\") " pod="hostpath-provisioner/csi-hostpathplugin-lx59c"
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.029689 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v46pw"
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.037053 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wxrw4"
Oct 06 06:47:39 crc kubenswrapper[4845]: W1006 06:47:39.038476 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod687eb311_8e3f_424c_8adf_b6637e656585.slice/crio-4b9331dc3b645e4e4843ac0afc37428dd05b0eb8bfcf7cad0dc5c5694628bb4d WatchSource:0}: Error finding container 4b9331dc3b645e4e4843ac0afc37428dd05b0eb8bfcf7cad0dc5c5694628bb4d: Status 404 returned error can't find the container with id 4b9331dc3b645e4e4843ac0afc37428dd05b0eb8bfcf7cad0dc5c5694628bb4d
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.043477 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmc6z\" (UniqueName: \"kubernetes.io/projected/6f011803-cc1d-4dac-8bdd-d954067c3ab3-kube-api-access-lmc6z\") pod \"ingress-canary-g5lp9\" (UID: \"6f011803-cc1d-4dac-8bdd-d954067c3ab3\") " pod="openshift-ingress-canary/ingress-canary-g5lp9"
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.052360 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw"
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.057483 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzsdq\" (UniqueName: \"kubernetes.io/projected/b60b102b-be29-4bcc-b0da-cb395cb25949-kube-api-access-hzsdq\") pod \"dns-default-kf7gv\" (UID: \"b60b102b-be29-4bcc-b0da-cb395cb25949\") " pod="openshift-dns/dns-default-kf7gv"
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.073616 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m"
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.075577 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:39 crc kubenswrapper[4845]: E1006 06:47:39.076336 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:39.576311578 +0000 UTC m=+144.091052586 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.081027 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g4k48"
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.149961 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pczzm"
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.167174 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-777zb"
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.177555 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:39 crc kubenswrapper[4845]: E1006 06:47:39.177915 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:39.677903607 +0000 UTC m=+144.192644615 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.216352 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg"]
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.237124 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-72mf4"
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.242524 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bhflm"]
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.258648 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-lx59c"
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.263581 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-g5lp9"
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.270583 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kf7gv"
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.278155 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:39 crc kubenswrapper[4845]: E1006 06:47:39.278577 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:39.778561332 +0000 UTC m=+144.293302340 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.360129 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t"]
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.361307 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7pmm8"]
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.379877 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:39 crc kubenswrapper[4845]: E1006 06:47:39.380275 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:39.880257423 +0000 UTC m=+144.394998431 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.427017 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh"]
Oct 06 06:47:39 crc kubenswrapper[4845]: W1006 06:47:39.466753 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a5668eb_22b9_4eca_b0fa_6c53e83da118.slice/crio-dd2752b0ee2981662282121f2f43262a0e1d2f84a339ad7cd0e344738b50dbc0 WatchSource:0}: Error finding container dd2752b0ee2981662282121f2f43262a0e1d2f84a339ad7cd0e344738b50dbc0: Status 404 returned error can't find the container with id dd2752b0ee2981662282121f2f43262a0e1d2f84a339ad7cd0e344738b50dbc0
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.481770 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:39 crc kubenswrapper[4845]: E1006 06:47:39.481898 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:39.981875342 +0000 UTC m=+144.496616350 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.482401 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:39 crc kubenswrapper[4845]: E1006 06:47:39.482685 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:39.982670252 +0000 UTC m=+144.497411260 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.583893 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:39 crc kubenswrapper[4845]: E1006 06:47:39.584854 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:40.084833355 +0000 UTC m=+144.599574363 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.631904 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-89k76"]
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.685593 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:39 crc kubenswrapper[4845]: E1006 06:47:39.685884 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:40.1858737 +0000 UTC m=+144.700614708 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.727676 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd"]
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.786038 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:39 crc kubenswrapper[4845]: E1006 06:47:39.786273 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:40.286250768 +0000 UTC m=+144.800991766 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.786466 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:39 crc kubenswrapper[4845]: E1006 06:47:39.786821 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:40.286810272 +0000 UTC m=+144.801551280 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.889998 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:39 crc kubenswrapper[4845]: E1006 06:47:39.890618 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:40.390591736 +0000 UTC m=+144.905332744 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.890857 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:39 crc kubenswrapper[4845]: E1006 06:47:39.891196 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:40.391183401 +0000 UTC m=+144.905924409 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:39 crc kubenswrapper[4845]: I1006 06:47:39.991862 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:39 crc kubenswrapper[4845]: E1006 06:47:39.992346 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:40.492331758 +0000 UTC m=+145.007072766 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.056633 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh" event={"ID":"0736e7ec-281a-4e6b-9665-531846db4828","Type":"ContainerStarted","Data":"bbf8492f2e4ea77c985c53be9d3ba8c155eec0661d24a2d6d0585e46eca67014"}
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.068697 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg" event={"ID":"791544e2-a968-422b-9c6e-db760c2c0b7a","Type":"ContainerStarted","Data":"7fdd8658dfe68035eb93b59215533d2d546ded4f9a1893b1b63872ccced066a5"}
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.072695 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" podStartSLOduration=123.07268074 podStartE2EDuration="2m3.07268074s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:40.071823228 +0000 UTC m=+144.586564256" watchObservedRunningTime="2025-10-06 06:47:40.07268074 +0000 UTC m=+144.587421748"
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.083630 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd" event={"ID":"5d009d0a-a266-4988-b410-9f0b99b66f2f","Type":"ContainerStarted","Data":"0c56dfb1bd6701038d808614418b1beead073ac4ec1e8a5f9d5f5c123af04dcc"}
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.083684 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd" event={"ID":"5d009d0a-a266-4988-b410-9f0b99b66f2f","Type":"ContainerStarted","Data":"46d1f531c0e03dad0f59a832b375934929b4d4915a2a2d2def999e6e09c9a34a"}
Oct 06 06:47:40 crc kubenswrapper[4845]: E1006 06:47:40.094284 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:40.594258635 +0000 UTC m=+145.108999643 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.100520 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qpxz6" event={"ID":"feb76755-71e9-4b98-b6f6-f6961e84f273","Type":"ContainerStarted","Data":"2cf123609a2ebb6deaaed413e2ac0d903d385f131bfc6ded6a42f4f88d716c35"}
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.106062 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd" event={"ID":"a54f269f-d3bd-4122-b8c0-9ee2b05d69bc","Type":"ContainerStarted","Data":"f40217ed37b6cfa83298e524da16b52ec639106a7ca60834759820bf514cbfeb"}
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.107848 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" event={"ID":"bc118c5d-7578-4c43-9a17-ffd83062571d","Type":"ContainerStarted","Data":"5556ae7c5498f9ed5ea2b9f8909e3bfb8eb1d3d9374eddd2c2d69f419278e81b"}
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.136312 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" event={"ID":"e4309f67-a2c8-4347-ac7f-95b0e9e43317","Type":"ContainerStarted","Data":"81c01f0853da6f1dc6fe97a06d9427cbdd3d6e07a17da73e8e604eef8a051257"}
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.093359 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.180297 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv" event={"ID":"97db6a94-9cba-47b5-bccb-3653396a9b3f","Type":"ContainerStarted","Data":"ecfdf2641960839726c2900c06c74e3f7a37dc51a21950b6c71ba52ed71b10f3"}
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.180348 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv" event={"ID":"97db6a94-9cba-47b5-bccb-3653396a9b3f","Type":"ContainerStarted","Data":"013105108ba276b21a83058487e4e7c2e0e1f3fc34b6d2427f38cbcfb07ad408"}
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.199282 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f2fng" event={"ID":"57b1e5a3-759a-45be-bea3-f102b6603edb","Type":"ContainerStarted","Data":"4a11303056b825b9cbd5aa9f00ef107c0820f58ce619186f457ac936936db601"}
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.263725 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:40 crc kubenswrapper[4845]: E1006 06:47:40.264326 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:40.764305405 +0000 UTC m=+145.279046413 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.267290 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-v6dlz" podStartSLOduration=124.26726981 podStartE2EDuration="2m4.26726981s" podCreationTimestamp="2025-10-06 06:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:40.176794462 +0000 UTC m=+144.691535500" watchObservedRunningTime="2025-10-06 06:47:40.26726981 +0000 UTC m=+144.782010818"
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.311443 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-v6457" event={"ID":"c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e","Type":"ContainerStarted","Data":"e7ccaca1de5e22a56acccf99f916648240c44e2c75d8693799f9323230e72870"}
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.332349 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-89k76" event={"ID":"e2095ab7-406e-40fe-84b1-72bac447e2c9","Type":"ContainerStarted","Data":"2f3b14ee9417f8543d67eeeff1f1273531fbb6bdbd82cca6deb59a83576cf310"}
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.340055 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-26wxt" event={"ID":"c465f78d-6eef-4782-b7d3-844c814fadea","Type":"ContainerStarted","Data":"75dd7b945080ca0f22d97b1c4e3155e9da99615dfcd67da01dccf5652025da5a"}
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.350887 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-72mf4" event={"ID":"eeeca672-ad97-48c4-b789-1393b16a0ec8","Type":"ContainerStarted","Data":"6522cf5d3ecbeecc609db84d5712ef8af2461440ad09e94e670f36c81ec99579"}
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.360892 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" event={"ID":"25e6cec0-1b44-4424-b931-81b30b582922","Type":"ContainerStarted","Data":"073f197a8d337ba774cb6ffcd3448a43309c34b94068fe6b8912bedcf989fb9f"}
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.361514 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q"
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.365275 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:40 crc kubenswrapper[4845]: E1006 06:47:40.367829 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:40.867817142 +0000 UTC m=+145.382558150 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.380746 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4sh9g"]
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.386181 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jrr8k" event={"ID":"82e84adb-480c-4357-aa9c-a92e1913f386","Type":"ContainerStarted","Data":"489a8ef441963243c20df6c31eb89ac49c3ac18b237d33da58ec7e34da55d7a3"}
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.386224 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jrr8k" event={"ID":"82e84adb-480c-4357-aa9c-a92e1913f386","Type":"ContainerStarted","Data":"eb9207f509b0a663a991c84bf059ac75fcf75f3e6a4955f7575e0acff46a2815"}
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.393260 4845 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-w8s9q container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.393320 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" podUID="25e6cec0-1b44-4424-b931-81b30b582922" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": 
dial tcp 10.217.0.10:8443: connect: connection refused" Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.397810 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bkqrp" event={"ID":"30d4a564-46dd-46fa-8ea4-d2b8166f3b89","Type":"ContainerStarted","Data":"6ef6588b0475c705ade37daa99303d82aebdaf0553e462aa86a1d7f760b86b97"} Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.404670 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cbspk"] Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.417095 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-47lzz" event={"ID":"d5c71177-1d9b-4f85-9431-1a4c421281ce","Type":"ContainerStarted","Data":"6e8fd7d2cc75d3a1155701926f0fbe040dcd480d3d04345713ce0261f8fe3153"} Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.419420 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-47lzz" Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.432994 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" event={"ID":"4a5668eb-22b9-4eca-b0fa-6c53e83da118","Type":"ContainerStarted","Data":"dd2752b0ee2981662282121f2f43262a0e1d2f84a339ad7cd0e344738b50dbc0"} Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.433836 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.438293 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8" 
event={"ID":"8a261240-d054-4d68-94d3-d0c675cfdde5","Type":"ContainerStarted","Data":"8e1799951ccc2585a16ed559457390738641e77b20d0acf67add2b5ca36ce081"} Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.447001 4845 patch_prober.go:28] interesting pod/downloads-7954f5f757-47lzz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.447062 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-47lzz" podUID="d5c71177-1d9b-4f85-9431-1a4c421281ce" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.453985 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6" event={"ID":"79a4e1d4-a8dc-4cc1-a4b7-450d8e9cc9c7","Type":"ContainerStarted","Data":"3cbf6655a25f8a0ae2085ee9a50a3c2a22dfff8508ea2792cd2cb228e1572a23"} Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.461057 4845 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-bhflm container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" start-of-body= Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.461123 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" podUID="4a5668eb-22b9-4eca-b0fa-6c53e83da118" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.468261 4845 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:40 crc kubenswrapper[4845]: E1006 06:47:40.468620 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:40.96858704 +0000 UTC m=+145.483328048 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.469099 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:40 crc kubenswrapper[4845]: E1006 06:47:40.473067 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:40.973049293 +0000 UTC m=+145.487790291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.499245 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kb4gk" event={"ID":"687eb311-8e3f-424c-8adf-b6637e656585","Type":"ContainerStarted","Data":"4b9331dc3b645e4e4843ac0afc37428dd05b0eb8bfcf7cad0dc5c5694628bb4d"} Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.501458 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-kb4gk" Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.518679 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.525633 4845 patch_prober.go:28] interesting pod/console-operator-58897d9998-kb4gk container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.525699 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-kb4gk" podUID="687eb311-8e3f-424c-8adf-b6637e656585" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 06 06:47:40 crc kubenswrapper[4845]: 
I1006 06:47:40.525768 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw"] Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.571724 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:40 crc kubenswrapper[4845]: E1006 06:47:40.572428 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:41.072407945 +0000 UTC m=+145.587148953 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.596881 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-78skb"] Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.609343 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lctgg"] Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.618450 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8rm5c"] Oct 06 06:47:40 crc kubenswrapper[4845]: 
I1006 06:47:40.620006 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lljhd"] Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.623987 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kq4kh"] Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.626189 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wxrw4"] Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.628145 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v46pw"] Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.632153 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m"] Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.636470 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f"] Oct 06 06:47:40 crc kubenswrapper[4845]: W1006 06:47:40.647258 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd60904fb_5e6c_42aa_bb1b_dcd22661b23b.slice/crio-ba6978c5b8ef188d74734f44d65bd364af80ed39080915ab42c8b79fa6b45e36 WatchSource:0}: Error finding container ba6978c5b8ef188d74734f44d65bd364af80ed39080915ab42c8b79fa6b45e36: Status 404 returned error can't find the container with id ba6978c5b8ef188d74734f44d65bd364af80ed39080915ab42c8b79fa6b45e36 Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.679491 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:40 crc kubenswrapper[4845]: E1006 06:47:40.697288 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:41.197261862 +0000 UTC m=+145.712002870 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.714191 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-lx59c"] Oct 06 06:47:40 crc kubenswrapper[4845]: W1006 06:47:40.739678 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaea19779_3ce8_4dad_8c79_fa74016ec424.slice/crio-4900e7b58286fb64eff642116a675b5ee186ddced59368e67fc27bce0fb7831d WatchSource:0}: Error finding container 4900e7b58286fb64eff642116a675b5ee186ddced59368e67fc27bce0fb7831d: Status 404 returned error can't find the container with id 4900e7b58286fb64eff642116a675b5ee186ddced59368e67fc27bce0fb7831d Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.747239 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kf7gv"] Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.748701 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g4k48"] Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.768966 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-g5lp9"] Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.776685 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-777zb"] Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.791661 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltmv" podStartSLOduration=123.791642988 podStartE2EDuration="2m3.791642988s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:40.78857621 +0000 UTC m=+145.303317208" watchObservedRunningTime="2025-10-06 06:47:40.791642988 +0000 UTC m=+145.306383996" Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.791819 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:40 crc kubenswrapper[4845]: E1006 06:47:40.793360 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:41.293345811 +0000 UTC m=+145.808086819 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.794573 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pczzm"] Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.804107 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-47lzz" podStartSLOduration=123.804091053 podStartE2EDuration="2m3.804091053s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:40.802474462 +0000 UTC m=+145.317215470" watchObservedRunningTime="2025-10-06 06:47:40.804091053 +0000 UTC m=+145.318832061" Oct 06 06:47:40 crc kubenswrapper[4845]: W1006 06:47:40.845729 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38106f1d_b5d4_4d89_b79b_2a6173fe76c2.slice/crio-6a2cfd5422d1593e008cb282fffe3bc40e60c756a3df33ce17e4b14e44f98e0d WatchSource:0}: Error finding container 6a2cfd5422d1593e008cb282fffe3bc40e60c756a3df33ce17e4b14e44f98e0d: Status 404 returned error can't find the container with id 6a2cfd5422d1593e008cb282fffe3bc40e60c756a3df33ce17e4b14e44f98e0d Oct 06 06:47:40 crc kubenswrapper[4845]: W1006 06:47:40.848444 4845 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f011803_cc1d_4dac_8bdd_d954067c3ab3.slice/crio-2ee9b7131b9dfad4aa88125e9456a159eace6612ca24762363b1277840855da5 WatchSource:0}: Error finding container 2ee9b7131b9dfad4aa88125e9456a159eace6612ca24762363b1277840855da5: Status 404 returned error can't find the container with id 2ee9b7131b9dfad4aa88125e9456a159eace6612ca24762363b1277840855da5 Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.895082 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:40 crc kubenswrapper[4845]: E1006 06:47:40.896680 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:41.396635652 +0000 UTC m=+145.911376660 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.925846 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-jrr8k" Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.940223 4845 patch_prober.go:28] interesting pod/router-default-5444994796-jrr8k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 06:47:40 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Oct 06 06:47:40 crc kubenswrapper[4845]: [+]process-running ok Oct 06 06:47:40 crc kubenswrapper[4845]: healthz check failed Oct 06 06:47:40 crc kubenswrapper[4845]: I1006 06:47:40.940267 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrr8k" podUID="82e84adb-480c-4357-aa9c-a92e1913f386" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 06:47:40 crc kubenswrapper[4845]: E1006 06:47:40.997644 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:41.497613066 +0000 UTC m=+146.012354074 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.000153 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.000677 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.001840 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bkqrp" podStartSLOduration=124.001811372 podStartE2EDuration="2m4.001811372s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:40.990923826 +0000 UTC m=+145.505664834" watchObservedRunningTime="2025-10-06 06:47:41.001811372 +0000 UTC m=+145.516552380" Oct 06 06:47:41 crc kubenswrapper[4845]: E1006 06:47:41.004436 4845 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:41.504409047 +0000 UTC m=+146.019150055 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.105199 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:41 crc kubenswrapper[4845]: E1006 06:47:41.107324 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:41.607298399 +0000 UTC m=+146.122039407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.107406 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:41 crc kubenswrapper[4845]: E1006 06:47:41.107859 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:41.607851773 +0000 UTC m=+146.122592781 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.177639 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" podStartSLOduration=124.17259263 podStartE2EDuration="2m4.17259263s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:41.166787513 +0000 UTC m=+145.681528531" watchObservedRunningTime="2025-10-06 06:47:41.17259263 +0000 UTC m=+145.687333638" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.178895 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f2fng" podStartSLOduration=124.178886129 podStartE2EDuration="2m4.178886129s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:41.12592689 +0000 UTC m=+145.640667908" watchObservedRunningTime="2025-10-06 06:47:41.178886129 +0000 UTC m=+145.693627137" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.208239 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:41 crc kubenswrapper[4845]: E1006 06:47:41.208432 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:41.708408645 +0000 UTC m=+146.223149653 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.208582 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:41 crc kubenswrapper[4845]: E1006 06:47:41.208924 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:41.708916208 +0000 UTC m=+146.223657206 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.240948 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8" podStartSLOduration=124.240927597 podStartE2EDuration="2m4.240927597s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:41.239270956 +0000 UTC m=+145.754011974" watchObservedRunningTime="2025-10-06 06:47:41.240927597 +0000 UTC m=+145.755668595" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.314476 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:41 crc kubenswrapper[4845]: E1006 06:47:41.315841 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:41.815817991 +0000 UTC m=+146.330558999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.337314 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-26wxt" podStartSLOduration=5.3372869640000005 podStartE2EDuration="5.337286964s" podCreationTimestamp="2025-10-06 06:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:41.319558746 +0000 UTC m=+145.834299774" watchObservedRunningTime="2025-10-06 06:47:41.337286964 +0000 UTC m=+145.852027972" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.418773 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:41 crc kubenswrapper[4845]: E1006 06:47:41.419194 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:41.919177834 +0000 UTC m=+146.433918842 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.425883 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd" podStartSLOduration=125.425856703 podStartE2EDuration="2m5.425856703s" podCreationTimestamp="2025-10-06 06:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:41.410501145 +0000 UTC m=+145.925242163" watchObservedRunningTime="2025-10-06 06:47:41.425856703 +0000 UTC m=+145.940597711" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.518885 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-89k76" event={"ID":"e2095ab7-406e-40fe-84b1-72bac447e2c9","Type":"ContainerStarted","Data":"abc31b03691bce8e11a8a2cb69d4d748a6c8553274da201e67b1df71a642aa53"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.520567 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:41 crc kubenswrapper[4845]: E1006 06:47:41.522681 4845 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:42.0226321 +0000 UTC m=+146.537373108 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.531441 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-kb4gk" podStartSLOduration=125.531424072 podStartE2EDuration="2m5.531424072s" podCreationTimestamp="2025-10-06 06:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:41.500717736 +0000 UTC m=+146.015458744" watchObservedRunningTime="2025-10-06 06:47:41.531424072 +0000 UTC m=+146.046165080" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.535199 4845 generic.go:334] "Generic (PLEG): container finished" podID="bc118c5d-7578-4c43-9a17-ffd83062571d" containerID="a813f6568130eb7ae8df21bc07defa06e56d0eeffcb3c0923675929a9fc37e74" exitCode=0 Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.535251 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" event={"ID":"bc118c5d-7578-4c43-9a17-ffd83062571d","Type":"ContainerDied","Data":"a813f6568130eb7ae8df21bc07defa06e56d0eeffcb3c0923675929a9fc37e74"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.623504 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:41 crc kubenswrapper[4845]: E1006 06:47:41.627747 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:42.127733467 +0000 UTC m=+146.642474475 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.640825 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-jrr8k" podStartSLOduration=124.640810638 podStartE2EDuration="2m4.640810638s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:41.607816764 +0000 UTC m=+146.122557782" watchObservedRunningTime="2025-10-06 06:47:41.640810638 +0000 UTC m=+146.155551646" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.642067 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-f7jz6" podStartSLOduration=124.64206132 
podStartE2EDuration="2m4.64206132s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:41.639659949 +0000 UTC m=+146.154400957" watchObservedRunningTime="2025-10-06 06:47:41.64206132 +0000 UTC m=+146.156802328" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.647727 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cbspk" event={"ID":"63275f52-7475-4177-ad94-9d2875bf90eb","Type":"ContainerStarted","Data":"0ea305ab97f2964acc50ec5ac5e2531e4f7c12d0d7241d75471a556753859bb2"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.647772 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cbspk" event={"ID":"63275f52-7475-4177-ad94-9d2875bf90eb","Type":"ContainerStarted","Data":"5faa472491cd7cbd66147de539c2c9faffe742693977e502c96cd19dcf3e50a2"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.665208 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw" event={"ID":"2bc08931-ed88-44c4-be4f-3d38e3b2564c","Type":"ContainerStarted","Data":"d9ca2a2018dd4b59183526a8499155444085fc391cca1272a8c74b6fe41c1475"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.665258 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw" event={"ID":"2bc08931-ed88-44c4-be4f-3d38e3b2564c","Type":"ContainerStarted","Data":"802bb4be10be3de5107818bbe2cd78a8feace6b1524358796f991ed5f625040c"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.671741 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qpxz6" 
event={"ID":"feb76755-71e9-4b98-b6f6-f6961e84f273","Type":"ContainerStarted","Data":"ae447ad7042c626e64b1020d278caef93f737ecd6f311a42f925b6dac24441bd"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.671779 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qpxz6" event={"ID":"feb76755-71e9-4b98-b6f6-f6961e84f273","Type":"ContainerStarted","Data":"a3d1e005251e4b98f8727318211ab97cca066ed43bad86153418ebd02bd3bc50"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.702898 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" podStartSLOduration=124.702883437 podStartE2EDuration="2m4.702883437s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:41.700581199 +0000 UTC m=+146.215322197" watchObservedRunningTime="2025-10-06 06:47:41.702883437 +0000 UTC m=+146.217624445" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.713238 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kb4gk" event={"ID":"687eb311-8e3f-424c-8adf-b6637e656585","Type":"ContainerStarted","Data":"2112b09c7a1f70183a292416cb7fe76c226b4a94d40099e205f4889beaba3ac0"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.715808 4845 patch_prober.go:28] interesting pod/console-operator-58897d9998-kb4gk container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.715842 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-kb4gk" 
podUID="687eb311-8e3f-424c-8adf-b6637e656585" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.724898 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:41 crc kubenswrapper[4845]: E1006 06:47:41.726128 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:42.226089824 +0000 UTC m=+146.740830832 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.733750 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-777zb" event={"ID":"e27d02e6-bcff-4a2e-a494-834fae1948c7","Type":"ContainerStarted","Data":"c6efec0be95f89af52b3fec65833b3c2d9323481045b6bb773170cc4f8dc3486"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.780420 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-89k76" podStartSLOduration=124.780403547 podStartE2EDuration="2m4.780403547s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:41.774721964 +0000 UTC m=+146.289462972" watchObservedRunningTime="2025-10-06 06:47:41.780403547 +0000 UTC m=+146.295144555" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.799554 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh" event={"ID":"0736e7ec-281a-4e6b-9665-531846db4828","Type":"ContainerStarted","Data":"a9f44e04797b2f5fbfa9722965b0e506ae622c4f7dacbdc74c7155935fe47014"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.799977 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh" event={"ID":"0736e7ec-281a-4e6b-9665-531846db4828","Type":"ContainerStarted","Data":"673b6a8ff3ffae6973edc2c1a3257e08284a9a602ccddd991f15b4ac806b5da3"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.814031 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v46pw" event={"ID":"0d7c4789-02fd-412a-9d10-b8b36250c32e","Type":"ContainerStarted","Data":"90725de540a9a961069372ef5ef579114f53c855a7fb41fa1fe5e699656c3c08"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.814083 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v46pw" event={"ID":"0d7c4789-02fd-412a-9d10-b8b36250c32e","Type":"ContainerStarted","Data":"5008a535f2f2bb4f6682b2947fce5519eca87443f828d17b63deb93b7f253b12"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 
06:47:41.825831 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:41 crc kubenswrapper[4845]: E1006 06:47:41.828205 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:42.328190666 +0000 UTC m=+146.842931674 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.841910 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" event={"ID":"cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67","Type":"ContainerStarted","Data":"94c39808c94f08ee0c79cb02694ec716dc89a1bc043addc0c7c7f120a0cf7d4f"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.842519 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.860404 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-72mf4" 
event={"ID":"eeeca672-ad97-48c4-b789-1393b16a0ec8","Type":"ContainerStarted","Data":"215cad5fb6245bdadd1596c95e99e87371550b6a0dce6378bc6fde5950f09af6"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.860447 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-72mf4" event={"ID":"eeeca672-ad97-48c4-b789-1393b16a0ec8","Type":"ContainerStarted","Data":"16b4d3af26edb6c64afe6bc97779f74fa9679a7d86a9f5d396814b042c09a357"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.863283 4845 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-lljhd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.863334 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" podUID="cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.876062 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qpxz6" podStartSLOduration=125.876038174 podStartE2EDuration="2m5.876038174s" podCreationTimestamp="2025-10-06 06:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:41.873235534 +0000 UTC m=+146.387976552" watchObservedRunningTime="2025-10-06 06:47:41.876038174 +0000 UTC m=+146.390779182" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.880415 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pczzm" event={"ID":"a4f5cb42-510a-4970-b36b-9d124665b46e","Type":"ContainerStarted","Data":"53ffbca5918dc9ac95cbb3c80e82384ddb3d852ee60d280b3e23ea4d549aa820"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.885970 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-26wxt" event={"ID":"c465f78d-6eef-4782-b7d3-844c814fadea","Type":"ContainerStarted","Data":"3ac371a914e1fc418c84a6a2bc7adc66bddf0652b43745043e058b5c59b751c6"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.887707 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m" event={"ID":"28776653-0824-40e8-8b6d-bd0380b34b2e","Type":"ContainerStarted","Data":"8285524303ab8c57472881c07a6de4c5558ea6d7da28a2c867ac37f1fdd445de"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.887737 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m" event={"ID":"28776653-0824-40e8-8b6d-bd0380b34b2e","Type":"ContainerStarted","Data":"020581d3c75ff3047a0553b567f443cb68c8b588ed1f39456703f838a6daad57"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.888361 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.890654 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" event={"ID":"4a5668eb-22b9-4eca-b0fa-6c53e83da118","Type":"ContainerStarted","Data":"e322a1f6cf5a317fb7d282441cdccf3fb214da309bfab95a262d1c49642babfd"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.895952 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bkqrp" event={"ID":"30d4a564-46dd-46fa-8ea4-d2b8166f3b89","Type":"ContainerStarted","Data":"782d19e43ac7d26c57ee4694256bb07e571680027762a8b106b3af172c102dea"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.897812 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cbspk" podStartSLOduration=124.897798665 podStartE2EDuration="2m4.897798665s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:41.896922372 +0000 UTC m=+146.411663370" watchObservedRunningTime="2025-10-06 06:47:41.897798665 +0000 UTC m=+146.412539673" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.908737 4845 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-jmx5m container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.908778 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m" podUID="28776653-0824-40e8-8b6d-bd0380b34b2e" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.910534 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kq4kh" event={"ID":"098443e0-56cc-4df4-81d4-0dbd6894934e","Type":"ContainerStarted","Data":"b6cfb00b291aed0aedecd55efdf8ad4e9b32d7887c2326b84f5d6679ff9bc896"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 
06:47:41.910602 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kq4kh" event={"ID":"098443e0-56cc-4df4-81d4-0dbd6894934e","Type":"ContainerStarted","Data":"3e1106909b06cdc0aa64ba47fadaf1123782cad9911312e4693ec398592550f5"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.928577 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:41 crc kubenswrapper[4845]: E1006 06:47:41.928890 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:42.42887481 +0000 UTC m=+146.943615818 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.932775 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lx59c" event={"ID":"aea19779-3ce8-4dad-8c79-fa74016ec424","Type":"ContainerStarted","Data":"4900e7b58286fb64eff642116a675b5ee186ddced59368e67fc27bce0fb7831d"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.934569 4845 patch_prober.go:28] interesting pod/router-default-5444994796-jrr8k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 06:47:41 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Oct 06 06:47:41 crc kubenswrapper[4845]: [+]process-running ok Oct 06 06:47:41 crc kubenswrapper[4845]: healthz check failed Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.934612 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrr8k" podUID="82e84adb-480c-4357-aa9c-a92e1913f386" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.946124 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4sh9g" event={"ID":"6336ebe1-aab1-4666-bc13-d6a1d4feedae","Type":"ContainerStarted","Data":"923eca4c0d8bfb9329e1e2808d73cf685a0567a6d8022cd68a28ca7e94bdde10"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 
06:47:41.946179 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4sh9g" event={"ID":"6336ebe1-aab1-4666-bc13-d6a1d4feedae","Type":"ContainerStarted","Data":"20b84f144202edc493e8cab34ce577096b2569d0857fb4ce218ff91f82dd7fca"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.947708 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kf7gv" event={"ID":"b60b102b-be29-4bcc-b0da-cb395cb25949","Type":"ContainerStarted","Data":"b6bfb57c44eff5f8e3c3421a95d033a973986e3c43f91c1a01ccf65dcc8c2574"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.949892 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f" event={"ID":"3461c091-3cf2-436e-a4fc-cbfe52d71d45","Type":"ContainerStarted","Data":"fb2e572a23a4387740368785c8cffbb5fe9632cf8100fd5329f896ed0fcbb1a3"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.949922 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f" event={"ID":"3461c091-3cf2-436e-a4fc-cbfe52d71d45","Type":"ContainerStarted","Data":"3d2173f6edd2b04d7d3f779e40f8b2ee89329f8ce6e5f1b09897195853990bfe"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.950466 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.957482 4845 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-dpq5f container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" start-of-body= Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.957542 4845 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f" podUID="3461c091-3cf2-436e-a4fc-cbfe52d71d45" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.957703 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd" event={"ID":"a54f269f-d3bd-4122-b8c0-9ee2b05d69bc","Type":"ContainerStarted","Data":"9cd54c823edac4d62a6b194b5f6de9dd5672881a93063328a077c482352331fa"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.958663 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.968912 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v46pw" podStartSLOduration=124.968891382 podStartE2EDuration="2m4.968891382s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:41.938088073 +0000 UTC m=+146.452829071" watchObservedRunningTime="2025-10-06 06:47:41.968891382 +0000 UTC m=+146.483632390" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.990123 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.997324 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" 
event={"ID":"25e6cec0-1b44-4424-b931-81b30b582922","Type":"ContainerStarted","Data":"c27db7c8ea53c1be7bba6c9010ff11ac7b78de0d190861cd810206dcd771e885"} Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.999171 4845 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-w8s9q container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 06 06:47:41 crc kubenswrapper[4845]: I1006 06:47:41.999255 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" podUID="25e6cec0-1b44-4424-b931-81b30b582922" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.006138 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd" Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.007151 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-72mf4" podStartSLOduration=126.007138589 podStartE2EDuration="2m6.007138589s" podCreationTimestamp="2025-10-06 06:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:41.970446091 +0000 UTC m=+146.485187099" watchObservedRunningTime="2025-10-06 06:47:42.007138589 +0000 UTC m=+146.521879597" Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.009257 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m" podStartSLOduration=125.009248033 
podStartE2EDuration="2m5.009248033s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:41.995413953 +0000 UTC m=+146.510154961" watchObservedRunningTime="2025-10-06 06:47:42.009248033 +0000 UTC m=+146.523989041" Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.022409 4845 generic.go:334] "Generic (PLEG): container finished" podID="e4309f67-a2c8-4347-ac7f-95b0e9e43317" containerID="1c9526ffae9c60b12e34cadbf8ca7c71e0fa8b57a30657d96c78ee1546e93c73" exitCode=0 Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.022485 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" event={"ID":"e4309f67-a2c8-4347-ac7f-95b0e9e43317","Type":"ContainerDied","Data":"1c9526ffae9c60b12e34cadbf8ca7c71e0fa8b57a30657d96c78ee1546e93c73"} Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.030635 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.050333 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g4k48" event={"ID":"38106f1d-b5d4-4d89-b79b-2a6173fe76c2","Type":"ContainerStarted","Data":"6a2cfd5422d1593e008cb282fffe3bc40e60c756a3df33ce17e4b14e44f98e0d"} Oct 06 06:47:42 crc kubenswrapper[4845]: E1006 06:47:42.057783 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 06:47:42.557759649 +0000 UTC m=+147.072500657 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.069528 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6ftmh" podStartSLOduration=125.069501426 podStartE2EDuration="2m5.069501426s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:42.057287867 +0000 UTC m=+146.572028865" watchObservedRunningTime="2025-10-06 06:47:42.069501426 +0000 UTC m=+146.584242434" Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.140566 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:42 crc kubenswrapper[4845]: E1006 06:47:42.141126 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:42.641103246 +0000 UTC m=+147.155844254 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.141455 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" podStartSLOduration=125.141344042 podStartE2EDuration="2m5.141344042s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:42.091442181 +0000 UTC m=+146.606183199" watchObservedRunningTime="2025-10-06 06:47:42.141344042 +0000 UTC m=+146.656085050" Oct 06 06:47:42 crc kubenswrapper[4845]: E1006 06:47:42.143295 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:42.643268551 +0000 UTC m=+147.158009559 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.142030 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.149243 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lctgg" event={"ID":"e810fd52-7506-4c72-98b3-97c6d1757550","Type":"ContainerStarted","Data":"42911ec61c9a2b2beeef94a48bb5313ddff01d0756010e81a9c1053ff43c22c4"} Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.149341 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lctgg" event={"ID":"e810fd52-7506-4c72-98b3-97c6d1757550","Type":"ContainerStarted","Data":"f80df207b54b3c7a4750ea79fa404ea043641978996ff507506ee544c8fbd2ec"} Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.161096 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f" podStartSLOduration=125.161075231 podStartE2EDuration="2m5.161075231s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-10-06 06:47:42.130928619 +0000 UTC m=+146.645669627" watchObservedRunningTime="2025-10-06 06:47:42.161075231 +0000 UTC m=+146.675816229" Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.198702 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wxrw4" event={"ID":"b323714e-8f3c-4e94-8785-0ea3d1418677","Type":"ContainerStarted","Data":"8633527ac9d28aa6e319a8f5ff301892d5ea5a669611b0f8578c1ee3d20f2e82"} Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.207022 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-kq4kh" podStartSLOduration=125.207002492 podStartE2EDuration="2m5.207002492s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:42.177472986 +0000 UTC m=+146.692213994" watchObservedRunningTime="2025-10-06 06:47:42.207002492 +0000 UTC m=+146.721743490" Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.250393 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:42 crc kubenswrapper[4845]: E1006 06:47:42.250530 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:42.750504652 +0000 UTC m=+147.265245660 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.250618 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:42 crc kubenswrapper[4845]: E1006 06:47:42.250969 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:42.750958044 +0000 UTC m=+147.265699052 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.296522 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g4k48" podStartSLOduration=125.296494105 podStartE2EDuration="2m5.296494105s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:42.213787714 +0000 UTC m=+146.728528722" watchObservedRunningTime="2025-10-06 06:47:42.296494105 +0000 UTC m=+146.811235123" Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.301931 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-g5lp9" event={"ID":"6f011803-cc1d-4dac-8bdd-d954067c3ab3","Type":"ContainerStarted","Data":"e0b1659439494f4dacbcd7eb027a3959d1b1e651680c13199115956f9a7d82fa"} Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.301996 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-g5lp9" event={"ID":"6f011803-cc1d-4dac-8bdd-d954067c3ab3","Type":"ContainerStarted","Data":"2ee9b7131b9dfad4aa88125e9456a159eace6612ca24762363b1277840855da5"} Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.312386 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-47lzz" 
event={"ID":"d5c71177-1d9b-4f85-9431-1a4c421281ce","Type":"ContainerStarted","Data":"1ec8de654d9606a2c628133090b949c0ad627d657b72f3571b55e253735c3408"} Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.319776 4845 patch_prober.go:28] interesting pod/downloads-7954f5f757-47lzz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.319850 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-47lzz" podUID="d5c71177-1d9b-4f85-9431-1a4c421281ce" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.355348 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:42 crc kubenswrapper[4845]: E1006 06:47:42.355675 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:42.85563446 +0000 UTC m=+147.370375468 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.355766 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:42 crc kubenswrapper[4845]: E1006 06:47:42.356290 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:42.856283347 +0000 UTC m=+147.371024355 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.356955 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qh7xd" podStartSLOduration=125.356934793 podStartE2EDuration="2m5.356934793s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:42.356849141 +0000 UTC m=+146.871590149" watchObservedRunningTime="2025-10-06 06:47:42.356934793 +0000 UTC m=+146.871675801" Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.385923 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-v6457" event={"ID":"c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e","Type":"ContainerStarted","Data":"62e4003d96cdf5f8214a31876bc41cc84412f08f8d94c0608aafd9406ee17d0f"} Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.437586 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-78skb" event={"ID":"d60904fb-5e6c-42aa-bb1b-dcd22661b23b","Type":"ContainerStarted","Data":"c97f53c72e608c4001fa45794855d048b3edcceda8fbb2cfea572bceb5c25dff"} Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.437998 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-78skb" 
event={"ID":"d60904fb-5e6c-42aa-bb1b-dcd22661b23b","Type":"ContainerStarted","Data":"ba6978c5b8ef188d74734f44d65bd364af80ed39080915ab42c8b79fa6b45e36"} Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.458074 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:42 crc kubenswrapper[4845]: E1006 06:47:42.459173 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:42.959157848 +0000 UTC m=+147.473898856 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.471029 4845 generic.go:334] "Generic (PLEG): container finished" podID="791544e2-a968-422b-9c6e-db760c2c0b7a" containerID="bc0a878801bdbd27c34de6cf677696336c76d0e688d2aaf117cd81a38ef43328" exitCode=0 Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.471160 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg" event={"ID":"791544e2-a968-422b-9c6e-db760c2c0b7a","Type":"ContainerDied","Data":"bc0a878801bdbd27c34de6cf677696336c76d0e688d2aaf117cd81a38ef43328"} Oct 06 
06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.483298 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-78skb" podStartSLOduration=125.483272298 podStartE2EDuration="2m5.483272298s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:42.482203931 +0000 UTC m=+146.996944949" watchObservedRunningTime="2025-10-06 06:47:42.483272298 +0000 UTC m=+146.998013316" Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.546410 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-g5lp9" podStartSLOduration=6.546392143 podStartE2EDuration="6.546392143s" podCreationTimestamp="2025-10-06 06:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:42.517737409 +0000 UTC m=+147.032478407" watchObservedRunningTime="2025-10-06 06:47:42.546392143 +0000 UTC m=+147.061133151" Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.548090 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8rm5c" event={"ID":"5e51318d-7bd1-442b-9283-1f6a305a5a4c","Type":"ContainerStarted","Data":"d14514b59e4727861ab9b4797e8e2fd8ad9ed205a3d77f7ec6bea28c58c0c74a"} Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.548148 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8rm5c" event={"ID":"5e51318d-7bd1-442b-9283-1f6a305a5a4c","Type":"ContainerStarted","Data":"ea3e46dcedca299f2e413be335c2a376c07de06e783708060f2fd523bf0bdd8c"} Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.560808 4845 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:42 crc kubenswrapper[4845]: E1006 06:47:42.561886 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:43.061874565 +0000 UTC m=+147.576615573 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.592711 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7pmm8" event={"ID":"8a261240-d054-4d68-94d3-d0c675cfdde5","Type":"ContainerStarted","Data":"9b26eb0277aad8dba604ff6542cb08c0576d8a452681eeed21f88681ca042641"} Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.610232 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-v6457" podStartSLOduration=125.610212027 podStartE2EDuration="2m5.610212027s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:42.548993659 +0000 UTC 
m=+147.063734677" watchObservedRunningTime="2025-10-06 06:47:42.610212027 +0000 UTC m=+147.124953035" Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.637994 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8rm5c" podStartSLOduration=125.637976659 podStartE2EDuration="2m5.637976659s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:42.635942778 +0000 UTC m=+147.150683786" watchObservedRunningTime="2025-10-06 06:47:42.637976659 +0000 UTC m=+147.152717677" Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.663791 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:42 crc kubenswrapper[4845]: E1006 06:47:42.665416 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:43.165360241 +0000 UTC m=+147.680101409 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.768349 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:42 crc kubenswrapper[4845]: E1006 06:47:42.768834 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:43.268819467 +0000 UTC m=+147.783560475 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.872862 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:42 crc kubenswrapper[4845]: E1006 06:47:42.873098 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:43.373053153 +0000 UTC m=+147.887794161 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.928607 4845 patch_prober.go:28] interesting pod/router-default-5444994796-jrr8k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 06:47:42 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Oct 06 06:47:42 crc kubenswrapper[4845]: [+]process-running ok Oct 06 06:47:42 crc kubenswrapper[4845]: healthz check failed Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.928704 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrr8k" podUID="82e84adb-480c-4357-aa9c-a92e1913f386" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 06:47:42 crc kubenswrapper[4845]: I1006 06:47:42.974883 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:42 crc kubenswrapper[4845]: E1006 06:47:42.975445 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 06:47:43.475425201 +0000 UTC m=+147.990166209 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.076318 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:43 crc kubenswrapper[4845]: E1006 06:47:43.077330 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:43.577308177 +0000 UTC m=+148.092049185 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.178623 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:43 crc kubenswrapper[4845]: E1006 06:47:43.179053 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:43.679033359 +0000 UTC m=+148.193774367 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.280191 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:43 crc kubenswrapper[4845]: E1006 06:47:43.280403 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:43.780349331 +0000 UTC m=+148.295090339 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.280540 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:43 crc kubenswrapper[4845]: E1006 06:47:43.280965 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:43.780915485 +0000 UTC m=+148.295656703 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.382403 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:43 crc kubenswrapper[4845]: E1006 06:47:43.382832 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:43.882810251 +0000 UTC m=+148.397551259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.483859 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:43 crc kubenswrapper[4845]: E1006 06:47:43.484277 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:43.984261646 +0000 UTC m=+148.499002654 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.585046 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:43 crc kubenswrapper[4845]: E1006 06:47:43.585522 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:44.085504286 +0000 UTC m=+148.600245294 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.599932 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pczzm" event={"ID":"a4f5cb42-510a-4970-b36b-9d124665b46e","Type":"ContainerStarted","Data":"a81b6a98d8d036b11031c1b72cc794c36b5810f5523639cf7284ad53e5c8163e"} Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.602836 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-g4k48" event={"ID":"38106f1d-b5d4-4d89-b79b-2a6173fe76c2","Type":"ContainerStarted","Data":"1712eb5efab16b7a81f6f94fb677911c7470ba13453783a8ccd3de9c8c23b014"} Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.607441 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wxrw4" event={"ID":"b323714e-8f3c-4e94-8785-0ea3d1418677","Type":"ContainerStarted","Data":"ec9d7062a7502d78cfa9892402b2cbb9a1c97407cd4fe768e58e052c5a66bae0"} Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.607487 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wxrw4" event={"ID":"b323714e-8f3c-4e94-8785-0ea3d1418677","Type":"ContainerStarted","Data":"d8160d3c7fd75b386267833811c263c19e8001b7b81380b5f5469ee8d567be2f"} Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.661107 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg" event={"ID":"791544e2-a968-422b-9c6e-db760c2c0b7a","Type":"ContainerStarted","Data":"f387fe2f3b8ca577de2d2044d46c511964207d5950c2b8e3e2c1b7a29d7b2de3"} Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.662017 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg" Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.684418 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8rm5c" event={"ID":"5e51318d-7bd1-442b-9283-1f6a305a5a4c","Type":"ContainerStarted","Data":"47d28f31317335cbe63de3fe666defd6788330a0fe351f4834fcdad34ae3f7d1"} Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.686092 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:43 crc kubenswrapper[4845]: E1006 06:47:43.687712 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:44.1876968 +0000 UTC m=+148.702437808 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.700004 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" event={"ID":"cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67","Type":"ContainerStarted","Data":"de5e7492999bc4fa13c81e395e0870d42ad33c800304a9c7a997d0d06184ac5e"} Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.702494 4845 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-lljhd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.702550 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" podUID="cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.703742 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4sh9g" event={"ID":"6336ebe1-aab1-4666-bc13-d6a1d4feedae","Type":"ContainerStarted","Data":"c95bb60bf2be644f7e2f6b02069378999b66ce9f277ee031269afc5f445a4b52"} Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.719519 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-777zb" event={"ID":"e27d02e6-bcff-4a2e-a494-834fae1948c7","Type":"ContainerStarted","Data":"ddbd51158d7bc54dae3d48f403cc21fe9c153f599444b6ba612c2c5d4ca9a2e2"} Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.719819 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-777zb" event={"ID":"e27d02e6-bcff-4a2e-a494-834fae1948c7","Type":"ContainerStarted","Data":"d0ad835bbd8e079793942541a349b96f07f868e9f6d4521cf229a23fa242e355"} Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.719843 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-777zb" Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.721652 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lctgg" event={"ID":"e810fd52-7506-4c72-98b3-97c6d1757550","Type":"ContainerStarted","Data":"4f05a3a32eb68eae58847698de8814e996a24bee1540eaeebad088f8c322d968"} Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.739530 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pczzm" podStartSLOduration=126.7395155 podStartE2EDuration="2m6.7395155s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:43.719458723 +0000 UTC m=+148.234199731" watchObservedRunningTime="2025-10-06 06:47:43.7395155 +0000 UTC m=+148.254256508" Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.742914 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" 
event={"ID":"bc118c5d-7578-4c43-9a17-ffd83062571d","Type":"ContainerStarted","Data":"4a6f5571a7b8610a1cabc62a0410150dfd1147f34ccf04456750c7400e9d645d"} Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.742976 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" event={"ID":"bc118c5d-7578-4c43-9a17-ffd83062571d","Type":"ContainerStarted","Data":"b86cf6abce5c846e9492278d001db92b3841876b90911442a380a16376e6e570"} Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.760278 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw" event={"ID":"2bc08931-ed88-44c4-be4f-3d38e3b2564c","Type":"ContainerStarted","Data":"413993a8a775e299cf81ceb38e12a1921682831b8e1a8a6fb57bcd426303539d"} Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.782503 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wxrw4" podStartSLOduration=126.782483646 podStartE2EDuration="2m6.782483646s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:43.78064445 +0000 UTC m=+148.295385458" watchObservedRunningTime="2025-10-06 06:47:43.782483646 +0000 UTC m=+148.297224654" Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.786917 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:43 crc kubenswrapper[4845]: E1006 06:47:43.788161 4845 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:44.28814714 +0000 UTC m=+148.802888148 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.801609 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" event={"ID":"e4309f67-a2c8-4347-ac7f-95b0e9e43317","Type":"ContainerStarted","Data":"e77cd40f203d5e636790a68e6d7edf29118060a02ca9994145c4bd3c35e0e03b"} Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.837314 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lx59c" event={"ID":"aea19779-3ce8-4dad-8c79-fa74016ec424","Type":"ContainerStarted","Data":"8517f2cf76d495f50c26dd11cbe22a64d90c1371bd8a846b8d4f82b3d97d26ad"} Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.848474 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg" podStartSLOduration=127.848458245 podStartE2EDuration="2m7.848458245s" podCreationTimestamp="2025-10-06 06:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:43.848141687 +0000 UTC m=+148.362882695" watchObservedRunningTime="2025-10-06 06:47:43.848458245 +0000 UTC m=+148.363199243" Oct 06 06:47:43 
crc kubenswrapper[4845]: I1006 06:47:43.877449 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kf7gv" event={"ID":"b60b102b-be29-4bcc-b0da-cb395cb25949","Type":"ContainerStarted","Data":"16032b677cb8595d7adb882311b72b6e459318c68895641bb30551184c1064d2"} Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.877486 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kf7gv" event={"ID":"b60b102b-be29-4bcc-b0da-cb395cb25949","Type":"ContainerStarted","Data":"1b42253ad8fcab31ba52f2b4206858cc51a7f4c9412337d6defee06d2525767d"} Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.877499 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-kf7gv" Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.882615 4845 patch_prober.go:28] interesting pod/downloads-7954f5f757-47lzz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.882790 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-47lzz" podUID="d5c71177-1d9b-4f85-9431-1a4c421281ce" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.896225 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmx5m" Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.897071 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:43 crc kubenswrapper[4845]: E1006 06:47:43.899719 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:44.39970447 +0000 UTC m=+148.914445468 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.901792 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-777zb" podStartSLOduration=126.901780323 podStartE2EDuration="2m6.901780323s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:43.900046499 +0000 UTC m=+148.414787507" watchObservedRunningTime="2025-10-06 06:47:43.901780323 +0000 UTC m=+148.416521331" Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.917669 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-kb4gk" Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.924791 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" Oct 06 
06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.942554 4845 patch_prober.go:28] interesting pod/router-default-5444994796-jrr8k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 06:47:43 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Oct 06 06:47:43 crc kubenswrapper[4845]: [+]process-running ok Oct 06 06:47:43 crc kubenswrapper[4845]: healthz check failed Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.942611 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrr8k" podUID="82e84adb-480c-4357-aa9c-a92e1913f386" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 06:47:43 crc kubenswrapper[4845]: I1006 06:47:43.999397 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:44 crc kubenswrapper[4845]: E1006 06:47:44.002532 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:44.50251257 +0000 UTC m=+149.017253578 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.058479 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-4sh9g" podStartSLOduration=127.058460994 podStartE2EDuration="2m7.058460994s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:43.963537604 +0000 UTC m=+148.478278612" watchObservedRunningTime="2025-10-06 06:47:44.058460994 +0000 UTC m=+148.573202002" Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.101981 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:44 crc kubenswrapper[4845]: E1006 06:47:44.102347 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:44.602335414 +0000 UTC m=+149.117076422 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.153681 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-lctgg" podStartSLOduration=127.153668661 podStartE2EDuration="2m7.153668661s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:44.152326438 +0000 UTC m=+148.667067446" watchObservedRunningTime="2025-10-06 06:47:44.153668661 +0000 UTC m=+148.668409669" Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.154628 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-kf7gv" podStartSLOduration=8.154622956 podStartE2EDuration="8.154622956s" podCreationTimestamp="2025-10-06 06:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:44.133645595 +0000 UTC m=+148.648386603" watchObservedRunningTime="2025-10-06 06:47:44.154622956 +0000 UTC m=+148.669363964" Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.204938 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Oct 06 06:47:44 crc kubenswrapper[4845]: E1006 06:47:44.205227 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:44.705210895 +0000 UTC m=+149.219951903 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.221758 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" podStartSLOduration=127.221745693 podStartE2EDuration="2m7.221745693s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:44.220182463 +0000 UTC m=+148.734923471" watchObservedRunningTime="2025-10-06 06:47:44.221745693 +0000 UTC m=+148.736486691"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.259475 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" podStartSLOduration=127.259457847 podStartE2EDuration="2m7.259457847s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:44.258539643 +0000 UTC m=+148.773280661" watchObservedRunningTime="2025-10-06 06:47:44.259457847 +0000 UTC m=+148.774198855"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.306223 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:44 crc kubenswrapper[4845]: E1006 06:47:44.306632 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:44.806614869 +0000 UTC m=+149.321355877 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.349543 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dpq5f"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.408015 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:44 crc kubenswrapper[4845]: E1006 06:47:44.408216 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:44.908190447 +0000 UTC m=+149.422931455 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.408280 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.408326 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.408401 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.408443 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.410453 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.419318 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.419598 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.420050 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.424114 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cp7cw" podStartSLOduration=127.424091579 podStartE2EDuration="2m7.424091579s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:44.372743941 +0000 UTC m=+148.887484959" watchObservedRunningTime="2025-10-06 06:47:44.424091579 +0000 UTC m=+148.938832587"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.510735 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:44 crc kubenswrapper[4845]: E1006 06:47:44.511046 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:45.011033838 +0000 UTC m=+149.525774836 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.540198 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.550632 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.555833 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.612456 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:44 crc kubenswrapper[4845]: E1006 06:47:44.612665 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:45.112632806 +0000 UTC m=+149.627373804 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.612716 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:44 crc kubenswrapper[4845]: E1006 06:47:44.613199 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:45.11318799 +0000 UTC m=+149.627928998 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.713912 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:44 crc kubenswrapper[4845]: E1006 06:47:44.714278 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:45.214248646 +0000 UTC m=+149.728989654 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.714557 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:44 crc kubenswrapper[4845]: E1006 06:47:44.714866 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:45.214855141 +0000 UTC m=+149.729596149 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.727454 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hnkmw"]
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.728296 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hnkmw"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.736217 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.741276 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hnkmw"]
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.816077 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.816248 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ab198b9-c91e-4d19-a404-88eece82b9d6-catalog-content\") pod \"certified-operators-hnkmw\" (UID: \"3ab198b9-c91e-4d19-a404-88eece82b9d6\") " pod="openshift-marketplace/certified-operators-hnkmw"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.816290 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ab198b9-c91e-4d19-a404-88eece82b9d6-utilities\") pod \"certified-operators-hnkmw\" (UID: \"3ab198b9-c91e-4d19-a404-88eece82b9d6\") " pod="openshift-marketplace/certified-operators-hnkmw"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.816340 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8gt9\" (UniqueName: \"kubernetes.io/projected/3ab198b9-c91e-4d19-a404-88eece82b9d6-kube-api-access-l8gt9\") pod \"certified-operators-hnkmw\" (UID: \"3ab198b9-c91e-4d19-a404-88eece82b9d6\") " pod="openshift-marketplace/certified-operators-hnkmw"
Oct 06 06:47:44 crc kubenswrapper[4845]: E1006 06:47:44.816555 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:45.316538992 +0000 UTC m=+149.831280000 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.910401 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n72q5"]
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.911698 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n72q5"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.912428 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lx59c" event={"ID":"aea19779-3ce8-4dad-8c79-fa74016ec424","Type":"ContainerStarted","Data":"6d703c8113888b8cd3ab492f32a79d024212be421d4a1673df0a5771f6a0d081"}
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.915060 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.918942 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8gt9\" (UniqueName: \"kubernetes.io/projected/3ab198b9-c91e-4d19-a404-88eece82b9d6-kube-api-access-l8gt9\") pod \"certified-operators-hnkmw\" (UID: \"3ab198b9-c91e-4d19-a404-88eece82b9d6\") " pod="openshift-marketplace/certified-operators-hnkmw"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.919001 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.919036 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ab198b9-c91e-4d19-a404-88eece82b9d6-catalog-content\") pod \"certified-operators-hnkmw\" (UID: \"3ab198b9-c91e-4d19-a404-88eece82b9d6\") " pod="openshift-marketplace/certified-operators-hnkmw"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.919076 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ab198b9-c91e-4d19-a404-88eece82b9d6-utilities\") pod \"certified-operators-hnkmw\" (UID: \"3ab198b9-c91e-4d19-a404-88eece82b9d6\") " pod="openshift-marketplace/certified-operators-hnkmw"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.919281 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-lljhd"
Oct 06 06:47:44 crc kubenswrapper[4845]: E1006 06:47:44.919873 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:45.419862114 +0000 UTC m=+149.934603122 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.920010 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ab198b9-c91e-4d19-a404-88eece82b9d6-utilities\") pod \"certified-operators-hnkmw\" (UID: \"3ab198b9-c91e-4d19-a404-88eece82b9d6\") " pod="openshift-marketplace/certified-operators-hnkmw"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.920185 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ab198b9-c91e-4d19-a404-88eece82b9d6-catalog-content\") pod \"certified-operators-hnkmw\" (UID: \"3ab198b9-c91e-4d19-a404-88eece82b9d6\") " pod="openshift-marketplace/certified-operators-hnkmw"
Oct 06 06:47:44 crc kubenswrapper[4845]: I1006 06:47:44.986408 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n72q5"]
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.002506 4845 patch_prober.go:28] interesting pod/router-default-5444994796-jrr8k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 06 06:47:45 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld
Oct 06 06:47:45 crc kubenswrapper[4845]: [+]process-running ok
Oct 06 06:47:45 crc kubenswrapper[4845]: healthz check failed
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.002570 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrr8k" podUID="82e84adb-480c-4357-aa9c-a92e1913f386" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.003452 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8gt9\" (UniqueName: \"kubernetes.io/projected/3ab198b9-c91e-4d19-a404-88eece82b9d6-kube-api-access-l8gt9\") pod \"certified-operators-hnkmw\" (UID: \"3ab198b9-c91e-4d19-a404-88eece82b9d6\") " pod="openshift-marketplace/certified-operators-hnkmw"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.023258 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.023619 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2-catalog-content\") pod \"community-operators-n72q5\" (UID: \"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2\") " pod="openshift-marketplace/community-operators-n72q5"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.023902 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2-utilities\") pod \"community-operators-n72q5\" (UID: \"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2\") " pod="openshift-marketplace/community-operators-n72q5"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.024184 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbf4j\" (UniqueName: \"kubernetes.io/projected/f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2-kube-api-access-bbf4j\") pod \"community-operators-n72q5\" (UID: \"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2\") " pod="openshift-marketplace/community-operators-n72q5"
Oct 06 06:47:45 crc kubenswrapper[4845]: E1006 06:47:45.025305 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:45.52528171 +0000 UTC m=+150.040022768 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.041590 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hnkmw"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.126301 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2-utilities\") pod \"community-operators-n72q5\" (UID: \"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2\") " pod="openshift-marketplace/community-operators-n72q5"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.126355 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.126398 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbf4j\" (UniqueName: \"kubernetes.io/projected/f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2-kube-api-access-bbf4j\") pod \"community-operators-n72q5\" (UID: \"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2\") " pod="openshift-marketplace/community-operators-n72q5"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.126482 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2-catalog-content\") pod \"community-operators-n72q5\" (UID: \"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2\") " pod="openshift-marketplace/community-operators-n72q5"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.126970 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2-utilities\") pod \"community-operators-n72q5\" (UID: \"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2\") " pod="openshift-marketplace/community-operators-n72q5"
Oct 06 06:47:45 crc kubenswrapper[4845]: E1006 06:47:45.127207 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:45.627196386 +0000 UTC m=+150.141937394 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.127782 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2-catalog-content\") pod \"community-operators-n72q5\" (UID: \"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2\") " pod="openshift-marketplace/community-operators-n72q5"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.170243 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6t54n"]
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.195823 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6t54n"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.239663 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.239787 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae038f22-b33b-4909-9d36-1b04c873e809-catalog-content\") pod \"certified-operators-6t54n\" (UID: \"ae038f22-b33b-4909-9d36-1b04c873e809\") " pod="openshift-marketplace/certified-operators-6t54n"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.239820 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcdd6\" (UniqueName: \"kubernetes.io/projected/ae038f22-b33b-4909-9d36-1b04c873e809-kube-api-access-vcdd6\") pod \"certified-operators-6t54n\" (UID: \"ae038f22-b33b-4909-9d36-1b04c873e809\") " pod="openshift-marketplace/certified-operators-6t54n"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.239903 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae038f22-b33b-4909-9d36-1b04c873e809-utilities\") pod \"certified-operators-6t54n\" (UID: \"ae038f22-b33b-4909-9d36-1b04c873e809\") " pod="openshift-marketplace/certified-operators-6t54n"
Oct 06 06:47:45 crc kubenswrapper[4845]: E1006 06:47:45.240004 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:45.739984878 +0000 UTC m=+150.254725886 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.244010 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbf4j\" (UniqueName: \"kubernetes.io/projected/f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2-kube-api-access-bbf4j\") pod \"community-operators-n72q5\" (UID: \"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2\") " pod="openshift-marketplace/community-operators-n72q5"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.289654 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n72q5"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.342141 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae038f22-b33b-4909-9d36-1b04c873e809-utilities\") pod \"certified-operators-6t54n\" (UID: \"ae038f22-b33b-4909-9d36-1b04c873e809\") " pod="openshift-marketplace/certified-operators-6t54n"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.342180 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae038f22-b33b-4909-9d36-1b04c873e809-catalog-content\") pod \"certified-operators-6t54n\" (UID: \"ae038f22-b33b-4909-9d36-1b04c873e809\") " pod="openshift-marketplace/certified-operators-6t54n"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.342206 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcdd6\" (UniqueName: \"kubernetes.io/projected/ae038f22-b33b-4909-9d36-1b04c873e809-kube-api-access-vcdd6\") pod \"certified-operators-6t54n\" (UID: \"ae038f22-b33b-4909-9d36-1b04c873e809\") " pod="openshift-marketplace/certified-operators-6t54n"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.342230 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:45 crc kubenswrapper[4845]: E1006 06:47:45.342581 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:45.842569472 +0000 UTC m=+150.357310480 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.343536 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae038f22-b33b-4909-9d36-1b04c873e809-utilities\") pod \"certified-operators-6t54n\" (UID: \"ae038f22-b33b-4909-9d36-1b04c873e809\") " pod="openshift-marketplace/certified-operators-6t54n"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.343795 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae038f22-b33b-4909-9d36-1b04c873e809-catalog-content\") pod \"certified-operators-6t54n\" (UID: \"ae038f22-b33b-4909-9d36-1b04c873e809\") " pod="openshift-marketplace/certified-operators-6t54n"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.387816 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fkq9g"]
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.389193 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fkq9g"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.415544 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6t54n"]
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.443277 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcdd6\" (UniqueName: \"kubernetes.io/projected/ae038f22-b33b-4909-9d36-1b04c873e809-kube-api-access-vcdd6\") pod \"certified-operators-6t54n\" (UID: \"ae038f22-b33b-4909-9d36-1b04c873e809\") " pod="openshift-marketplace/certified-operators-6t54n"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.444072 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.444337 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/293fea8c-d627-4852-84fb-c98e487df3a0-catalog-content\") pod \"community-operators-fkq9g\" (UID: \"293fea8c-d627-4852-84fb-c98e487df3a0\") " pod="openshift-marketplace/community-operators-fkq9g"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.444477 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7d66\" (UniqueName: \"kubernetes.io/projected/293fea8c-d627-4852-84fb-c98e487df3a0-kube-api-access-f7d66\") pod \"community-operators-fkq9g\" (UID: \"293fea8c-d627-4852-84fb-c98e487df3a0\") " pod="openshift-marketplace/community-operators-fkq9g"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.444576 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/293fea8c-d627-4852-84fb-c98e487df3a0-utilities\") pod \"community-operators-fkq9g\" (UID: \"293fea8c-d627-4852-84fb-c98e487df3a0\") " pod="openshift-marketplace/community-operators-fkq9g"
Oct 06 06:47:45 crc kubenswrapper[4845]: E1006 06:47:45.444829 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:45.944807157 +0000 UTC m=+150.459548165 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.546327 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/293fea8c-d627-4852-84fb-c98e487df3a0-catalog-content\") pod \"community-operators-fkq9g\" (UID: \"293fea8c-d627-4852-84fb-c98e487df3a0\") " pod="openshift-marketplace/community-operators-fkq9g"
Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.546401 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7d66\" (UniqueName: \"kubernetes.io/projected/293fea8c-d627-4852-84fb-c98e487df3a0-kube-api-access-f7d66\") pod \"community-operators-fkq9g\" (UID: \"293fea8c-d627-4852-84fb-c98e487df3a0\") " pod="openshift-marketplace/community-operators-fkq9g"
Oct 06 06:47:45 crc 
kubenswrapper[4845]: I1006 06:47:45.546430 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/293fea8c-d627-4852-84fb-c98e487df3a0-utilities\") pod \"community-operators-fkq9g\" (UID: \"293fea8c-d627-4852-84fb-c98e487df3a0\") " pod="openshift-marketplace/community-operators-fkq9g" Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.546467 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:45 crc kubenswrapper[4845]: E1006 06:47:45.546807 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:46.046793364 +0000 UTC m=+150.561534372 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.547248 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/293fea8c-d627-4852-84fb-c98e487df3a0-catalog-content\") pod \"community-operators-fkq9g\" (UID: \"293fea8c-d627-4852-84fb-c98e487df3a0\") " pod="openshift-marketplace/community-operators-fkq9g" Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.547277 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/293fea8c-d627-4852-84fb-c98e487df3a0-utilities\") pod \"community-operators-fkq9g\" (UID: \"293fea8c-d627-4852-84fb-c98e487df3a0\") " pod="openshift-marketplace/community-operators-fkq9g" Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.552405 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6t54n" Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.556275 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fkq9g"] Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.578233 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7d66\" (UniqueName: \"kubernetes.io/projected/293fea8c-d627-4852-84fb-c98e487df3a0-kube-api-access-f7d66\") pod \"community-operators-fkq9g\" (UID: \"293fea8c-d627-4852-84fb-c98e487df3a0\") " pod="openshift-marketplace/community-operators-fkq9g" Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.649594 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:45 crc kubenswrapper[4845]: E1006 06:47:45.650015 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:46.149988693 +0000 UTC m=+150.664729701 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.650164 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:45 crc kubenswrapper[4845]: E1006 06:47:45.650467 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:46.150445345 +0000 UTC m=+150.665186353 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.735623 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hnkmw"] Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.736103 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fkq9g" Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.751727 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:45 crc kubenswrapper[4845]: E1006 06:47:45.752238 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:46.252223278 +0000 UTC m=+150.766964286 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.858229 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:45 crc kubenswrapper[4845]: E1006 06:47:45.860578 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:46.360561828 +0000 UTC m=+150.875302836 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:45 crc kubenswrapper[4845]: W1006 06:47:45.874800 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-1f6f5503b6f2ba9f163bed814e61bcc860125be92bdd15c68f6bfd085b61ea7a WatchSource:0}: Error finding container 1f6f5503b6f2ba9f163bed814e61bcc860125be92bdd15c68f6bfd085b61ea7a: Status 404 returned error can't find the container with id 1f6f5503b6f2ba9f163bed814e61bcc860125be92bdd15c68f6bfd085b61ea7a Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.942002 4845 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.944543 4845 patch_prober.go:28] interesting pod/router-default-5444994796-jrr8k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 06:47:45 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Oct 06 06:47:45 crc kubenswrapper[4845]: [+]process-running ok Oct 06 06:47:45 crc kubenswrapper[4845]: healthz check failed Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.944742 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrr8k" podUID="82e84adb-480c-4357-aa9c-a92e1913f386" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.962959 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:45 crc kubenswrapper[4845]: E1006 06:47:45.964743 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:46.464711631 +0000 UTC m=+150.979452639 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.966484 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:45 crc kubenswrapper[4845]: E1006 06:47:45.967669 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-10-06 06:47:46.467642855 +0000 UTC m=+150.982383923 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:45 crc kubenswrapper[4845]: I1006 06:47:45.985934 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n72q5"] Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.002972 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lx59c" event={"ID":"aea19779-3ce8-4dad-8c79-fa74016ec424","Type":"ContainerStarted","Data":"95fdde64cd2b6dbaf2ba9dda850e5389424d966755c179e462a3ff635aadd6a6"} Oct 06 06:47:46 crc kubenswrapper[4845]: W1006 06:47:46.021136 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2eeb1a7_98d0_4b52_9c07_f9ba90f486e2.slice/crio-442d18c70ec3cf7c8a44dce1c229b7657abfdadca58f03b0c99a57c90d57a0e5 WatchSource:0}: Error finding container 442d18c70ec3cf7c8a44dce1c229b7657abfdadca58f03b0c99a57c90d57a0e5: Status 404 returned error can't find the container with id 442d18c70ec3cf7c8a44dce1c229b7657abfdadca58f03b0c99a57c90d57a0e5 Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.027118 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1f6f5503b6f2ba9f163bed814e61bcc860125be92bdd15c68f6bfd085b61ea7a"} Oct 06 06:47:46 crc kubenswrapper[4845]: 
I1006 06:47:46.036867 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"9b416fc3afbabf9cb6678e8efdab33a4f30b16a2b2774863362b9cea18cd215d"} Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.043714 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.044619 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.046428 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.049947 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.052412 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.064328 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hnkmw" event={"ID":"3ab198b9-c91e-4d19-a404-88eece82b9d6","Type":"ContainerStarted","Data":"55141e59c522d29737414e4d68a8bba76e1f92fb3d2d85fe19320b1ddbce2f53"} Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.067113 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7badaaf3d9a633bf6a94278c815d2761b74ffecc096381c79641eab20c27de46"} Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.067344 4845 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:46 crc kubenswrapper[4845]: E1006 06:47:46.067412 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:46.567396487 +0000 UTC m=+151.082137495 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.068493 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:46 crc kubenswrapper[4845]: E1006 06:47:46.068751 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:46.568738061 +0000 UTC m=+151.083479069 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.107950 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sdgkg" Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.170965 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.171439 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a787847d-91a6-42f6-8650-b9f7b4156eb3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a787847d-91a6-42f6-8650-b9f7b4156eb3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.171540 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a787847d-91a6-42f6-8650-b9f7b4156eb3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a787847d-91a6-42f6-8650-b9f7b4156eb3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 06:47:46 crc kubenswrapper[4845]: E1006 06:47:46.171780 4845 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:46.671762306 +0000 UTC m=+151.186503314 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.273835 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.273881 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a787847d-91a6-42f6-8650-b9f7b4156eb3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a787847d-91a6-42f6-8650-b9f7b4156eb3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.273911 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a787847d-91a6-42f6-8650-b9f7b4156eb3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a787847d-91a6-42f6-8650-b9f7b4156eb3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 
06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.273990 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a787847d-91a6-42f6-8650-b9f7b4156eb3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a787847d-91a6-42f6-8650-b9f7b4156eb3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 06:47:46 crc kubenswrapper[4845]: E1006 06:47:46.274267 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:46.774256767 +0000 UTC m=+151.288997775 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.316314 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a787847d-91a6-42f6-8650-b9f7b4156eb3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a787847d-91a6-42f6-8650-b9f7b4156eb3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.374769 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 06:47:46 
crc kubenswrapper[4845]: E1006 06:47:46.375080 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 06:47:46.875066666 +0000 UTC m=+151.389807674 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.397093 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.439654 4845 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-06T06:47:45.942050588Z","Handler":null,"Name":""} Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.443860 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6t54n"] Oct 06 06:47:46 crc kubenswrapper[4845]: W1006 06:47:46.472330 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae038f22_b33b_4909_9d36_1b04c873e809.slice/crio-e4a0b32360ecfc1c816c80a96da0345c953ea5aff815954e0d4bbfd45bfd43b8 WatchSource:0}: Error finding container e4a0b32360ecfc1c816c80a96da0345c953ea5aff815954e0d4bbfd45bfd43b8: Status 404 returned error can't find the container with id 
e4a0b32360ecfc1c816c80a96da0345c953ea5aff815954e0d4bbfd45bfd43b8
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.476407 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:46 crc kubenswrapper[4845]: E1006 06:47:46.476797 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 06:47:46.976782538 +0000 UTC m=+151.491523546 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n6tj" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.478530 4845 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.478558 4845 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.529884 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fkq9g"]
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.577088 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.631021 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.678534 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.684246 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.684285 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.710852 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lqcvc"]
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.714998 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqcvc"
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.718245 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.726351 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqcvc"]
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.741226 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n6tj\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.784178 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa5e8be-a350-4d6d-b854-aabf6341b043-utilities\") pod \"redhat-marketplace-lqcvc\" (UID: \"1aa5e8be-a350-4d6d-b854-aabf6341b043\") " pod="openshift-marketplace/redhat-marketplace-lqcvc"
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.784640 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa5e8be-a350-4d6d-b854-aabf6341b043-catalog-content\") pod \"redhat-marketplace-lqcvc\" (UID: \"1aa5e8be-a350-4d6d-b854-aabf6341b043\") " pod="openshift-marketplace/redhat-marketplace-lqcvc"
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.784667 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz7kh\" (UniqueName: \"kubernetes.io/projected/1aa5e8be-a350-4d6d-b854-aabf6341b043-kube-api-access-pz7kh\") pod \"redhat-marketplace-lqcvc\" (UID: \"1aa5e8be-a350-4d6d-b854-aabf6341b043\") " pod="openshift-marketplace/redhat-marketplace-lqcvc"
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.825781 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.859615 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.885704 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa5e8be-a350-4d6d-b854-aabf6341b043-catalog-content\") pod \"redhat-marketplace-lqcvc\" (UID: \"1aa5e8be-a350-4d6d-b854-aabf6341b043\") " pod="openshift-marketplace/redhat-marketplace-lqcvc"
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.885743 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz7kh\" (UniqueName: \"kubernetes.io/projected/1aa5e8be-a350-4d6d-b854-aabf6341b043-kube-api-access-pz7kh\") pod \"redhat-marketplace-lqcvc\" (UID: \"1aa5e8be-a350-4d6d-b854-aabf6341b043\") " pod="openshift-marketplace/redhat-marketplace-lqcvc"
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.885811 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa5e8be-a350-4d6d-b854-aabf6341b043-utilities\") pod \"redhat-marketplace-lqcvc\" (UID: \"1aa5e8be-a350-4d6d-b854-aabf6341b043\") " pod="openshift-marketplace/redhat-marketplace-lqcvc"
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.886353 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa5e8be-a350-4d6d-b854-aabf6341b043-catalog-content\") pod \"redhat-marketplace-lqcvc\" (UID: \"1aa5e8be-a350-4d6d-b854-aabf6341b043\") " pod="openshift-marketplace/redhat-marketplace-lqcvc"
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.886481 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa5e8be-a350-4d6d-b854-aabf6341b043-utilities\") pod \"redhat-marketplace-lqcvc\" (UID: \"1aa5e8be-a350-4d6d-b854-aabf6341b043\") " pod="openshift-marketplace/redhat-marketplace-lqcvc"
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.907241 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz7kh\" (UniqueName: \"kubernetes.io/projected/1aa5e8be-a350-4d6d-b854-aabf6341b043-kube-api-access-pz7kh\") pod \"redhat-marketplace-lqcvc\" (UID: \"1aa5e8be-a350-4d6d-b854-aabf6341b043\") " pod="openshift-marketplace/redhat-marketplace-lqcvc"
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.929195 4845 patch_prober.go:28] interesting pod/router-default-5444994796-jrr8k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 06 06:47:46 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld
Oct 06 06:47:46 crc kubenswrapper[4845]: [+]process-running ok
Oct 06 06:47:46 crc kubenswrapper[4845]: healthz check failed
Oct 06 06:47:46 crc kubenswrapper[4845]: I1006 06:47:46.929544 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrr8k" podUID="82e84adb-480c-4357-aa9c-a92e1913f386" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.072939 4845 generic.go:334] "Generic (PLEG): container finished" podID="ae038f22-b33b-4909-9d36-1b04c873e809" containerID="6a5b1df7e31b44d00bef4eed33fbbdc1eb8405dfb1494a0cea5fc064b9d644c8" exitCode=0
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.073018 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t54n" event={"ID":"ae038f22-b33b-4909-9d36-1b04c873e809","Type":"ContainerDied","Data":"6a5b1df7e31b44d00bef4eed33fbbdc1eb8405dfb1494a0cea5fc064b9d644c8"}
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.073047 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t54n" event={"ID":"ae038f22-b33b-4909-9d36-1b04c873e809","Type":"ContainerStarted","Data":"e4a0b32360ecfc1c816c80a96da0345c953ea5aff815954e0d4bbfd45bfd43b8"}
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.075226 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.078137 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lx59c" event={"ID":"aea19779-3ce8-4dad-8c79-fa74016ec424","Type":"ContainerStarted","Data":"3da05fadba38c9194c8bb9e89e69319c615833f13be0d2b5175b9bc031562526"}
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.089945 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"859dc7216057249ad7ba128ef7ee10f96f9b812ffb5b78702f04748f43db8d7f"}
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.091989 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.093855 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"efe52d2f1f194c79eecb90ed85a25a9f9267a5e4a662f66a233f4ed4f4cf52ac"}
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.098073 4845 generic.go:334] "Generic (PLEG): container finished" podID="3ab198b9-c91e-4d19-a404-88eece82b9d6" containerID="8effcacd8ab1f24fda40706bc47cd194cd72fc88d9f04111e2a227d56abc212e" exitCode=0
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.098159 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hnkmw" event={"ID":"3ab198b9-c91e-4d19-a404-88eece82b9d6","Type":"ContainerDied","Data":"8effcacd8ab1f24fda40706bc47cd194cd72fc88d9f04111e2a227d56abc212e"}
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.098951 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a787847d-91a6-42f6-8650-b9f7b4156eb3","Type":"ContainerStarted","Data":"69b56fad5d92c41d28e0c49056f1f1ff29a44eb8b50b447d594ad2cdc153acaa"}
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.101990 4845 generic.go:334] "Generic (PLEG): container finished" podID="293fea8c-d627-4852-84fb-c98e487df3a0" containerID="aa05ba6396230f9c785b1cb1486db092223209386e4b194a3af92abbcbbcb892" exitCode=0
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.102511 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkq9g" event={"ID":"293fea8c-d627-4852-84fb-c98e487df3a0","Type":"ContainerDied","Data":"aa05ba6396230f9c785b1cb1486db092223209386e4b194a3af92abbcbbcb892"}
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.102539 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkq9g" event={"ID":"293fea8c-d627-4852-84fb-c98e487df3a0","Type":"ContainerStarted","Data":"a833b5ab807f78f8f965f11efc9ca0c5df54911fef8982acc5cd59f7ee7d0701"}
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.108605 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bp58w"]
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.110709 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bp58w"
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.111315 4845 generic.go:334] "Generic (PLEG): container finished" podID="f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2" containerID="737e2cd6623faf22a6a6fa0a553f5d4b553855a3e8c2771c99a5dc35f43f98af" exitCode=0
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.111411 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n72q5" event={"ID":"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2","Type":"ContainerDied","Data":"737e2cd6623faf22a6a6fa0a553f5d4b553855a3e8c2771c99a5dc35f43f98af"}
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.111431 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n72q5" event={"ID":"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2","Type":"ContainerStarted","Data":"442d18c70ec3cf7c8a44dce1c229b7657abfdadca58f03b0c99a57c90d57a0e5"}
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.117503 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"bcf276e87e36532e4c48e4072bd687493eebb48f4335bd21f10edc948a29af6c"}
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.122850 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqcvc"
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.123789 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp58w"]
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.169547 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-lx59c" podStartSLOduration=11.169528533 podStartE2EDuration="11.169528533s" podCreationTimestamp="2025-10-06 06:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:47.137864773 +0000 UTC m=+151.652605781" watchObservedRunningTime="2025-10-06 06:47:47.169528533 +0000 UTC m=+151.684269541"
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.192361 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/192eb68c-c202-49cc-8fda-31df1ba7c691-catalog-content\") pod \"redhat-marketplace-bp58w\" (UID: \"192eb68c-c202-49cc-8fda-31df1ba7c691\") " pod="openshift-marketplace/redhat-marketplace-bp58w"
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.194141 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zj78\" (UniqueName: \"kubernetes.io/projected/192eb68c-c202-49cc-8fda-31df1ba7c691-kube-api-access-6zj78\") pod \"redhat-marketplace-bp58w\" (UID: \"192eb68c-c202-49cc-8fda-31df1ba7c691\") " pod="openshift-marketplace/redhat-marketplace-bp58w"
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.194403 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/192eb68c-c202-49cc-8fda-31df1ba7c691-utilities\") pod \"redhat-marketplace-bp58w\" (UID: \"192eb68c-c202-49cc-8fda-31df1ba7c691\") " pod="openshift-marketplace/redhat-marketplace-bp58w"
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.298199 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/192eb68c-c202-49cc-8fda-31df1ba7c691-catalog-content\") pod \"redhat-marketplace-bp58w\" (UID: \"192eb68c-c202-49cc-8fda-31df1ba7c691\") " pod="openshift-marketplace/redhat-marketplace-bp58w"
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.298318 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zj78\" (UniqueName: \"kubernetes.io/projected/192eb68c-c202-49cc-8fda-31df1ba7c691-kube-api-access-6zj78\") pod \"redhat-marketplace-bp58w\" (UID: \"192eb68c-c202-49cc-8fda-31df1ba7c691\") " pod="openshift-marketplace/redhat-marketplace-bp58w"
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.298424 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/192eb68c-c202-49cc-8fda-31df1ba7c691-utilities\") pod \"redhat-marketplace-bp58w\" (UID: \"192eb68c-c202-49cc-8fda-31df1ba7c691\") " pod="openshift-marketplace/redhat-marketplace-bp58w"
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.299347 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/192eb68c-c202-49cc-8fda-31df1ba7c691-utilities\") pod \"redhat-marketplace-bp58w\" (UID: \"192eb68c-c202-49cc-8fda-31df1ba7c691\") " pod="openshift-marketplace/redhat-marketplace-bp58w"
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.299840 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/192eb68c-c202-49cc-8fda-31df1ba7c691-catalog-content\") pod \"redhat-marketplace-bp58w\" (UID: \"192eb68c-c202-49cc-8fda-31df1ba7c691\") " pod="openshift-marketplace/redhat-marketplace-bp58w"
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.315133 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9n6tj"]
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.322218 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zj78\" (UniqueName: \"kubernetes.io/projected/192eb68c-c202-49cc-8fda-31df1ba7c691-kube-api-access-6zj78\") pod \"redhat-marketplace-bp58w\" (UID: \"192eb68c-c202-49cc-8fda-31df1ba7c691\") " pod="openshift-marketplace/redhat-marketplace-bp58w"
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.385099 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqcvc"]
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.437316 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bp58w"
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.659165 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp58w"]
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.933119 4845 patch_prober.go:28] interesting pod/router-default-5444994796-jrr8k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 06 06:47:47 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld
Oct 06 06:47:47 crc kubenswrapper[4845]: [+]process-running ok
Oct 06 06:47:47 crc kubenswrapper[4845]: healthz check failed
Oct 06 06:47:47 crc kubenswrapper[4845]: I1006 06:47:47.933205 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrr8k" podUID="82e84adb-480c-4357-aa9c-a92e1913f386" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.106991 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n4jm5"]
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.108451 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4jm5"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.114060 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.114099 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4jm5"]
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.133896 4845 generic.go:334] "Generic (PLEG): container finished" podID="192eb68c-c202-49cc-8fda-31df1ba7c691" containerID="bed96113091d2e2af7371312cc384e9f264d5dafe2dfdc02503eb85965fcaf88" exitCode=0
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.134534 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp58w" event={"ID":"192eb68c-c202-49cc-8fda-31df1ba7c691","Type":"ContainerDied","Data":"bed96113091d2e2af7371312cc384e9f264d5dafe2dfdc02503eb85965fcaf88"}
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.134702 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp58w" event={"ID":"192eb68c-c202-49cc-8fda-31df1ba7c691","Type":"ContainerStarted","Data":"0d5aa323569fd106bbfb44c408affe5000949f422cadc9a2c91e795f30e1af69"}
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.136270 4845 generic.go:334] "Generic (PLEG): container finished" podID="a787847d-91a6-42f6-8650-b9f7b4156eb3" containerID="0ec44d7cf3068cccf2634fa35ec457142eb055ebb535a38905b300aac9350676" exitCode=0
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.136341 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a787847d-91a6-42f6-8650-b9f7b4156eb3","Type":"ContainerDied","Data":"0ec44d7cf3068cccf2634fa35ec457142eb055ebb535a38905b300aac9350676"}
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.142561 4845 generic.go:334] "Generic (PLEG): container finished" podID="1aa5e8be-a350-4d6d-b854-aabf6341b043" containerID="16207404271c86431b80c1f517800b7aad5775839295c4cec79f863ca27e055e" exitCode=0
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.142652 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqcvc" event={"ID":"1aa5e8be-a350-4d6d-b854-aabf6341b043","Type":"ContainerDied","Data":"16207404271c86431b80c1f517800b7aad5775839295c4cec79f863ca27e055e"}
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.142689 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqcvc" event={"ID":"1aa5e8be-a350-4d6d-b854-aabf6341b043","Type":"ContainerStarted","Data":"9b373833f32de701c991abf4498448696658377c4c3e6373bfc45279ccadd614"}
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.148264 4845 generic.go:334] "Generic (PLEG): container finished" podID="5d009d0a-a266-4988-b410-9f0b99b66f2f" containerID="0c56dfb1bd6701038d808614418b1beead073ac4ec1e8a5f9d5f5c123af04dcc" exitCode=0
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.148393 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd" event={"ID":"5d009d0a-a266-4988-b410-9f0b99b66f2f","Type":"ContainerDied","Data":"0c56dfb1bd6701038d808614418b1beead073ac4ec1e8a5f9d5f5c123af04dcc"}
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.154646 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" event={"ID":"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11","Type":"ContainerStarted","Data":"a529c1b3d2451077b143cb8ee2daf713c0331dee08f31eccdbaccf1775aac713"}
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.155244 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.155267 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" event={"ID":"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11","Type":"ContainerStarted","Data":"ada81241639488e2403073aa2c5d497b9e86c82e021fecf4948ac3dfd0d1be49"}
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.185661 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-v6dlz"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.185720 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-v6dlz"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.187143 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" podStartSLOduration=131.187125742 podStartE2EDuration="2m11.187125742s" podCreationTimestamp="2025-10-06 06:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:48.181771336 +0000 UTC m=+152.696512344" watchObservedRunningTime="2025-10-06 06:47:48.187125742 +0000 UTC m=+152.701866760"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.193597 4845 patch_prober.go:28] interesting pod/console-f9d7485db-v6dlz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.193659 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-v6dlz" podUID="55d881a0-9c07-42e7-aed8-8a883c4b1ff5" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.210943 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7948z\" (UniqueName: \"kubernetes.io/projected/847bb591-c8a9-4c1a-bace-005ba77c1644-kube-api-access-7948z\") pod \"redhat-operators-n4jm5\" (UID: \"847bb591-c8a9-4c1a-bace-005ba77c1644\") " pod="openshift-marketplace/redhat-operators-n4jm5"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.211091 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/847bb591-c8a9-4c1a-bace-005ba77c1644-catalog-content\") pod \"redhat-operators-n4jm5\" (UID: \"847bb591-c8a9-4c1a-bace-005ba77c1644\") " pod="openshift-marketplace/redhat-operators-n4jm5"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.211431 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/847bb591-c8a9-4c1a-bace-005ba77c1644-utilities\") pod \"redhat-operators-n4jm5\" (UID: \"847bb591-c8a9-4c1a-bace-005ba77c1644\") " pod="openshift-marketplace/redhat-operators-n4jm5"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.248018 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.319150 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7948z\" (UniqueName: \"kubernetes.io/projected/847bb591-c8a9-4c1a-bace-005ba77c1644-kube-api-access-7948z\") pod \"redhat-operators-n4jm5\" (UID: \"847bb591-c8a9-4c1a-bace-005ba77c1644\") " pod="openshift-marketplace/redhat-operators-n4jm5"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.319228 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/847bb591-c8a9-4c1a-bace-005ba77c1644-catalog-content\") pod \"redhat-operators-n4jm5\" (UID: \"847bb591-c8a9-4c1a-bace-005ba77c1644\") " pod="openshift-marketplace/redhat-operators-n4jm5"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.319254 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/847bb591-c8a9-4c1a-bace-005ba77c1644-utilities\") pod \"redhat-operators-n4jm5\" (UID: \"847bb591-c8a9-4c1a-bace-005ba77c1644\") " pod="openshift-marketplace/redhat-operators-n4jm5"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.319817 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/847bb591-c8a9-4c1a-bace-005ba77c1644-utilities\") pod \"redhat-operators-n4jm5\" (UID: \"847bb591-c8a9-4c1a-bace-005ba77c1644\") " pod="openshift-marketplace/redhat-operators-n4jm5"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.319756 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/847bb591-c8a9-4c1a-bace-005ba77c1644-catalog-content\") pod \"redhat-operators-n4jm5\" (UID: \"847bb591-c8a9-4c1a-bace-005ba77c1644\") " pod="openshift-marketplace/redhat-operators-n4jm5"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.327263 4845 patch_prober.go:28] interesting pod/downloads-7954f5f757-47lzz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.327717 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-47lzz" podUID="d5c71177-1d9b-4f85-9431-1a4c421281ce" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.328330 4845 patch_prober.go:28] interesting pod/downloads-7954f5f757-47lzz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.328404 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-47lzz" podUID="d5c71177-1d9b-4f85-9431-1a4c421281ce" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.342752 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7948z\" (UniqueName: \"kubernetes.io/projected/847bb591-c8a9-4c1a-bace-005ba77c1644-kube-api-access-7948z\") pod \"redhat-operators-n4jm5\" (UID: \"847bb591-c8a9-4c1a-bace-005ba77c1644\") " pod="openshift-marketplace/redhat-operators-n4jm5"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.367622 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-hmpsg"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.367661 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-hmpsg"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.374144 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-hmpsg"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.470279 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4jm5"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.511110 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vbsmj"]
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.512320 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vbsmj"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.531909 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vbsmj"]
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.628476 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/526273f1-c5bb-49d8-828d-4957e03ee814-catalog-content\") pod \"redhat-operators-vbsmj\" (UID: \"526273f1-c5bb-49d8-828d-4957e03ee814\") " pod="openshift-marketplace/redhat-operators-vbsmj"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.628548 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526273f1-c5bb-49d8-828d-4957e03ee814-utilities\") pod \"redhat-operators-vbsmj\" (UID: \"526273f1-c5bb-49d8-828d-4957e03ee814\") " pod="openshift-marketplace/redhat-operators-vbsmj"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.628656 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c82t7\" (UniqueName: \"kubernetes.io/projected/526273f1-c5bb-49d8-828d-4957e03ee814-kube-api-access-c82t7\") pod \"redhat-operators-vbsmj\" (UID: \"526273f1-c5bb-49d8-828d-4957e03ee814\") " pod="openshift-marketplace/redhat-operators-vbsmj"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.735137 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c82t7\" (UniqueName: \"kubernetes.io/projected/526273f1-c5bb-49d8-828d-4957e03ee814-kube-api-access-c82t7\") pod \"redhat-operators-vbsmj\" (UID: \"526273f1-c5bb-49d8-828d-4957e03ee814\") " pod="openshift-marketplace/redhat-operators-vbsmj"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.735699 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/526273f1-c5bb-49d8-828d-4957e03ee814-catalog-content\") pod \"redhat-operators-vbsmj\" (UID: \"526273f1-c5bb-49d8-828d-4957e03ee814\") " pod="openshift-marketplace/redhat-operators-vbsmj"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.735727 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526273f1-c5bb-49d8-828d-4957e03ee814-utilities\") pod \"redhat-operators-vbsmj\" (UID: \"526273f1-c5bb-49d8-828d-4957e03ee814\") " pod="openshift-marketplace/redhat-operators-vbsmj"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.736670 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/526273f1-c5bb-49d8-828d-4957e03ee814-catalog-content\") pod \"redhat-operators-vbsmj\" (UID: \"526273f1-c5bb-49d8-828d-4957e03ee814\") " pod="openshift-marketplace/redhat-operators-vbsmj"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.736870 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526273f1-c5bb-49d8-828d-4957e03ee814-utilities\") pod \"redhat-operators-vbsmj\" (UID: \"526273f1-c5bb-49d8-828d-4957e03ee814\") " pod="openshift-marketplace/redhat-operators-vbsmj"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.756641 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c82t7\" (UniqueName: \"kubernetes.io/projected/526273f1-c5bb-49d8-828d-4957e03ee814-kube-api-access-c82t7\") pod \"redhat-operators-vbsmj\" (UID: \"526273f1-c5bb-49d8-828d-4957e03ee814\") " pod="openshift-marketplace/redhat-operators-vbsmj"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.771211 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.771247 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.777102 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t"
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.827709 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4jm5"]
Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.842824 4845 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-vbsmj" Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.924520 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-jrr8k" Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.929930 4845 patch_prober.go:28] interesting pod/router-default-5444994796-jrr8k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 06:47:48 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Oct 06 06:47:48 crc kubenswrapper[4845]: [+]process-running ok Oct 06 06:47:48 crc kubenswrapper[4845]: healthz check failed Oct 06 06:47:48 crc kubenswrapper[4845]: I1006 06:47:48.929971 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrr8k" podUID="82e84adb-480c-4357-aa9c-a92e1913f386" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.043808 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vbsmj"] Oct 06 06:47:49 crc kubenswrapper[4845]: W1006 06:47:49.067615 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod526273f1_c5bb_49d8_828d_4957e03ee814.slice/crio-cc0f8e32f429fd5a1790b39ff94208c054891cd746b9a07aefed2c9c10dbdc9d WatchSource:0}: Error finding container cc0f8e32f429fd5a1790b39ff94208c054891cd746b9a07aefed2c9c10dbdc9d: Status 404 returned error can't find the container with id cc0f8e32f429fd5a1790b39ff94208c054891cd746b9a07aefed2c9c10dbdc9d Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.159585 4845 generic.go:334] "Generic (PLEG): container finished" podID="847bb591-c8a9-4c1a-bace-005ba77c1644" 
containerID="c976208d51a68962962aeaaaaff63ffe5249cbb0b745bc74e2e6d28dc1ccabdc" exitCode=0 Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.160072 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4jm5" event={"ID":"847bb591-c8a9-4c1a-bace-005ba77c1644","Type":"ContainerDied","Data":"c976208d51a68962962aeaaaaff63ffe5249cbb0b745bc74e2e6d28dc1ccabdc"} Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.160217 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4jm5" event={"ID":"847bb591-c8a9-4c1a-bace-005ba77c1644","Type":"ContainerStarted","Data":"c48c296b50e52b0da386fa9cd2c0db10106eef37c1e3a5edcaa8783750f0d522"} Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.162562 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbsmj" event={"ID":"526273f1-c5bb-49d8-828d-4957e03ee814","Type":"ContainerStarted","Data":"cc0f8e32f429fd5a1790b39ff94208c054891cd746b9a07aefed2c9c10dbdc9d"} Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.167745 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-htk6t" Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.171413 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-hmpsg" Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.572616 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.651978 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a787847d-91a6-42f6-8650-b9f7b4156eb3-kubelet-dir\") pod \"a787847d-91a6-42f6-8650-b9f7b4156eb3\" (UID: \"a787847d-91a6-42f6-8650-b9f7b4156eb3\") " Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.652083 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a787847d-91a6-42f6-8650-b9f7b4156eb3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a787847d-91a6-42f6-8650-b9f7b4156eb3" (UID: "a787847d-91a6-42f6-8650-b9f7b4156eb3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.652663 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a787847d-91a6-42f6-8650-b9f7b4156eb3-kube-api-access\") pod \"a787847d-91a6-42f6-8650-b9f7b4156eb3\" (UID: \"a787847d-91a6-42f6-8650-b9f7b4156eb3\") " Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.652767 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd" Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.653025 4845 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a787847d-91a6-42f6-8650-b9f7b4156eb3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.661345 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a787847d-91a6-42f6-8650-b9f7b4156eb3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a787847d-91a6-42f6-8650-b9f7b4156eb3" (UID: "a787847d-91a6-42f6-8650-b9f7b4156eb3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.755843 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d009d0a-a266-4988-b410-9f0b99b66f2f-config-volume\") pod \"5d009d0a-a266-4988-b410-9f0b99b66f2f\" (UID: \"5d009d0a-a266-4988-b410-9f0b99b66f2f\") " Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.755913 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x7tj\" (UniqueName: \"kubernetes.io/projected/5d009d0a-a266-4988-b410-9f0b99b66f2f-kube-api-access-8x7tj\") pod \"5d009d0a-a266-4988-b410-9f0b99b66f2f\" (UID: \"5d009d0a-a266-4988-b410-9f0b99b66f2f\") " Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.756057 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d009d0a-a266-4988-b410-9f0b99b66f2f-secret-volume\") pod \"5d009d0a-a266-4988-b410-9f0b99b66f2f\" (UID: \"5d009d0a-a266-4988-b410-9f0b99b66f2f\") " Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.756478 4845 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a787847d-91a6-42f6-8650-b9f7b4156eb3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.757168 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d009d0a-a266-4988-b410-9f0b99b66f2f-config-volume" (OuterVolumeSpecName: "config-volume") pod "5d009d0a-a266-4988-b410-9f0b99b66f2f" (UID: "5d009d0a-a266-4988-b410-9f0b99b66f2f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.761103 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d009d0a-a266-4988-b410-9f0b99b66f2f-kube-api-access-8x7tj" (OuterVolumeSpecName: "kube-api-access-8x7tj") pod "5d009d0a-a266-4988-b410-9f0b99b66f2f" (UID: "5d009d0a-a266-4988-b410-9f0b99b66f2f"). InnerVolumeSpecName "kube-api-access-8x7tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.762393 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d009d0a-a266-4988-b410-9f0b99b66f2f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5d009d0a-a266-4988-b410-9f0b99b66f2f" (UID: "5d009d0a-a266-4988-b410-9f0b99b66f2f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.857614 4845 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d009d0a-a266-4988-b410-9f0b99b66f2f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.857645 4845 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d009d0a-a266-4988-b410-9f0b99b66f2f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.857656 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x7tj\" (UniqueName: \"kubernetes.io/projected/5d009d0a-a266-4988-b410-9f0b99b66f2f-kube-api-access-8x7tj\") on node \"crc\" DevicePath \"\"" Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.932621 4845 patch_prober.go:28] interesting pod/router-default-5444994796-jrr8k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 06:47:49 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Oct 06 06:47:49 crc kubenswrapper[4845]: [+]process-running ok Oct 06 06:47:49 crc kubenswrapper[4845]: healthz check failed Oct 06 06:47:49 crc kubenswrapper[4845]: I1006 06:47:49.932676 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrr8k" podUID="82e84adb-480c-4357-aa9c-a92e1913f386" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 06:47:50 crc kubenswrapper[4845]: I1006 06:47:50.177121 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd" 
event={"ID":"5d009d0a-a266-4988-b410-9f0b99b66f2f","Type":"ContainerDied","Data":"46d1f531c0e03dad0f59a832b375934929b4d4915a2a2d2def999e6e09c9a34a"} Oct 06 06:47:50 crc kubenswrapper[4845]: I1006 06:47:50.177154 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd" Oct 06 06:47:50 crc kubenswrapper[4845]: I1006 06:47:50.177175 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46d1f531c0e03dad0f59a832b375934929b4d4915a2a2d2def999e6e09c9a34a" Oct 06 06:47:50 crc kubenswrapper[4845]: I1006 06:47:50.179946 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 06:47:50 crc kubenswrapper[4845]: I1006 06:47:50.179953 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a787847d-91a6-42f6-8650-b9f7b4156eb3","Type":"ContainerDied","Data":"69b56fad5d92c41d28e0c49056f1f1ff29a44eb8b50b447d594ad2cdc153acaa"} Oct 06 06:47:50 crc kubenswrapper[4845]: I1006 06:47:50.179997 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69b56fad5d92c41d28e0c49056f1f1ff29a44eb8b50b447d594ad2cdc153acaa" Oct 06 06:47:50 crc kubenswrapper[4845]: I1006 06:47:50.201826 4845 generic.go:334] "Generic (PLEG): container finished" podID="526273f1-c5bb-49d8-828d-4957e03ee814" containerID="ce0faab006057b8e3bd9aacdf400129683f74bfa3c78f9ab892386d31507cfd7" exitCode=0 Oct 06 06:47:50 crc kubenswrapper[4845]: I1006 06:47:50.202571 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbsmj" event={"ID":"526273f1-c5bb-49d8-828d-4957e03ee814","Type":"ContainerDied","Data":"ce0faab006057b8e3bd9aacdf400129683f74bfa3c78f9ab892386d31507cfd7"} Oct 06 06:47:50 crc kubenswrapper[4845]: I1006 06:47:50.928087 4845 
patch_prober.go:28] interesting pod/router-default-5444994796-jrr8k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 06:47:50 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Oct 06 06:47:50 crc kubenswrapper[4845]: [+]process-running ok Oct 06 06:47:50 crc kubenswrapper[4845]: healthz check failed Oct 06 06:47:50 crc kubenswrapper[4845]: I1006 06:47:50.928178 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrr8k" podUID="82e84adb-480c-4357-aa9c-a92e1913f386" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 06:47:51 crc kubenswrapper[4845]: I1006 06:47:51.273646 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-kf7gv" Oct 06 06:47:51 crc kubenswrapper[4845]: I1006 06:47:51.339002 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 06:47:51 crc kubenswrapper[4845]: E1006 06:47:51.351642 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a787847d-91a6-42f6-8650-b9f7b4156eb3" containerName="pruner" Oct 06 06:47:51 crc kubenswrapper[4845]: I1006 06:47:51.351672 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="a787847d-91a6-42f6-8650-b9f7b4156eb3" containerName="pruner" Oct 06 06:47:51 crc kubenswrapper[4845]: E1006 06:47:51.351691 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d009d0a-a266-4988-b410-9f0b99b66f2f" containerName="collect-profiles" Oct 06 06:47:51 crc kubenswrapper[4845]: I1006 06:47:51.351698 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d009d0a-a266-4988-b410-9f0b99b66f2f" containerName="collect-profiles" Oct 06 06:47:51 crc kubenswrapper[4845]: I1006 06:47:51.351791 4845 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="5d009d0a-a266-4988-b410-9f0b99b66f2f" containerName="collect-profiles" Oct 06 06:47:51 crc kubenswrapper[4845]: I1006 06:47:51.351802 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="a787847d-91a6-42f6-8650-b9f7b4156eb3" containerName="pruner" Oct 06 06:47:51 crc kubenswrapper[4845]: I1006 06:47:51.352128 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 06:47:51 crc kubenswrapper[4845]: I1006 06:47:51.352210 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 06:47:51 crc kubenswrapper[4845]: I1006 06:47:51.354590 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 06 06:47:51 crc kubenswrapper[4845]: I1006 06:47:51.355802 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 06 06:47:51 crc kubenswrapper[4845]: I1006 06:47:51.389432 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd62485b-5623-4974-b3dd-0fc57a2d1674-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fd62485b-5623-4974-b3dd-0fc57a2d1674\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 06:47:51 crc kubenswrapper[4845]: I1006 06:47:51.389545 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd62485b-5623-4974-b3dd-0fc57a2d1674-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fd62485b-5623-4974-b3dd-0fc57a2d1674\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 06:47:51 crc kubenswrapper[4845]: I1006 06:47:51.491008 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/fd62485b-5623-4974-b3dd-0fc57a2d1674-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fd62485b-5623-4974-b3dd-0fc57a2d1674\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 06:47:51 crc kubenswrapper[4845]: I1006 06:47:51.491477 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd62485b-5623-4974-b3dd-0fc57a2d1674-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fd62485b-5623-4974-b3dd-0fc57a2d1674\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 06:47:51 crc kubenswrapper[4845]: I1006 06:47:51.491148 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd62485b-5623-4974-b3dd-0fc57a2d1674-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fd62485b-5623-4974-b3dd-0fc57a2d1674\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 06:47:51 crc kubenswrapper[4845]: I1006 06:47:51.531057 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd62485b-5623-4974-b3dd-0fc57a2d1674-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fd62485b-5623-4974-b3dd-0fc57a2d1674\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 06:47:51 crc kubenswrapper[4845]: I1006 06:47:51.670545 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 06:47:51 crc kubenswrapper[4845]: I1006 06:47:51.927387 4845 patch_prober.go:28] interesting pod/router-default-5444994796-jrr8k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 06:47:51 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Oct 06 06:47:51 crc kubenswrapper[4845]: [+]process-running ok Oct 06 06:47:51 crc kubenswrapper[4845]: healthz check failed Oct 06 06:47:51 crc kubenswrapper[4845]: I1006 06:47:51.927462 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrr8k" podUID="82e84adb-480c-4357-aa9c-a92e1913f386" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 06:47:52 crc kubenswrapper[4845]: I1006 06:47:52.126510 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 06:47:52 crc kubenswrapper[4845]: I1006 06:47:52.237664 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fd62485b-5623-4974-b3dd-0fc57a2d1674","Type":"ContainerStarted","Data":"41618b31075978cb62f01e87b8e1ac04a82c229b1c9d792e3fa5c87276246a9a"} Oct 06 06:47:52 crc kubenswrapper[4845]: I1006 06:47:52.928550 4845 patch_prober.go:28] interesting pod/router-default-5444994796-jrr8k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 06:47:52 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Oct 06 06:47:52 crc kubenswrapper[4845]: [+]process-running ok Oct 06 06:47:52 crc kubenswrapper[4845]: healthz check failed Oct 06 06:47:52 crc kubenswrapper[4845]: I1006 06:47:52.929009 4845 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrr8k" podUID="82e84adb-480c-4357-aa9c-a92e1913f386" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 06:47:53 crc kubenswrapper[4845]: I1006 06:47:53.019235 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 06:47:53 crc kubenswrapper[4845]: I1006 06:47:53.019324 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 06:47:53 crc kubenswrapper[4845]: I1006 06:47:53.240704 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fd62485b-5623-4974-b3dd-0fc57a2d1674","Type":"ContainerStarted","Data":"6e9a712f0158e20fba37f3b7a72c6d960fd33256380e977d0b644bde8c473887"} Oct 06 06:47:53 crc kubenswrapper[4845]: I1006 06:47:53.258753 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.258723249 podStartE2EDuration="2.258723249s" podCreationTimestamp="2025-10-06 06:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:47:53.257876428 +0000 UTC m=+157.772617436" watchObservedRunningTime="2025-10-06 06:47:53.258723249 +0000 UTC m=+157.773464257" Oct 06 06:47:53 crc kubenswrapper[4845]: I1006 06:47:53.928426 4845 patch_prober.go:28] interesting 
pod/router-default-5444994796-jrr8k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 06:47:53 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Oct 06 06:47:53 crc kubenswrapper[4845]: [+]process-running ok Oct 06 06:47:53 crc kubenswrapper[4845]: healthz check failed Oct 06 06:47:53 crc kubenswrapper[4845]: I1006 06:47:53.928975 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrr8k" podUID="82e84adb-480c-4357-aa9c-a92e1913f386" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 06:47:54 crc kubenswrapper[4845]: I1006 06:47:54.263585 4845 generic.go:334] "Generic (PLEG): container finished" podID="fd62485b-5623-4974-b3dd-0fc57a2d1674" containerID="6e9a712f0158e20fba37f3b7a72c6d960fd33256380e977d0b644bde8c473887" exitCode=0 Oct 06 06:47:54 crc kubenswrapper[4845]: I1006 06:47:54.263664 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fd62485b-5623-4974-b3dd-0fc57a2d1674","Type":"ContainerDied","Data":"6e9a712f0158e20fba37f3b7a72c6d960fd33256380e977d0b644bde8c473887"} Oct 06 06:47:54 crc kubenswrapper[4845]: I1006 06:47:54.927631 4845 patch_prober.go:28] interesting pod/router-default-5444994796-jrr8k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 06:47:54 crc kubenswrapper[4845]: [+]has-synced ok Oct 06 06:47:54 crc kubenswrapper[4845]: [+]process-running ok Oct 06 06:47:54 crc kubenswrapper[4845]: healthz check failed Oct 06 06:47:54 crc kubenswrapper[4845]: I1006 06:47:54.927717 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrr8k" 
podUID="82e84adb-480c-4357-aa9c-a92e1913f386" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 06:47:55 crc kubenswrapper[4845]: I1006 06:47:55.927170 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-jrr8k" Oct 06 06:47:55 crc kubenswrapper[4845]: I1006 06:47:55.929558 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-jrr8k" Oct 06 06:47:58 crc kubenswrapper[4845]: I1006 06:47:58.181212 4845 patch_prober.go:28] interesting pod/console-f9d7485db-v6dlz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 06 06:47:58 crc kubenswrapper[4845]: I1006 06:47:58.181651 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-v6dlz" podUID="55d881a0-9c07-42e7-aed8-8a883c4b1ff5" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 06 06:47:58 crc kubenswrapper[4845]: I1006 06:47:58.335903 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-47lzz" Oct 06 06:48:00 crc kubenswrapper[4845]: I1006 06:48:00.162857 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs\") pod \"network-metrics-daemon-4l7qj\" (UID: \"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\") " pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:48:00 crc kubenswrapper[4845]: I1006 06:48:00.175549 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/f80a2f04-a041-4acb-ace9-c0e40aed5f6d-metrics-certs\") pod \"network-metrics-daemon-4l7qj\" (UID: \"f80a2f04-a041-4acb-ace9-c0e40aed5f6d\") " pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:48:00 crc kubenswrapper[4845]: I1006 06:48:00.245871 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4l7qj" Oct 06 06:48:03 crc kubenswrapper[4845]: I1006 06:48:03.181694 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 06:48:03 crc kubenswrapper[4845]: I1006 06:48:03.205362 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd62485b-5623-4974-b3dd-0fc57a2d1674-kube-api-access\") pod \"fd62485b-5623-4974-b3dd-0fc57a2d1674\" (UID: \"fd62485b-5623-4974-b3dd-0fc57a2d1674\") " Oct 06 06:48:03 crc kubenswrapper[4845]: I1006 06:48:03.205583 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd62485b-5623-4974-b3dd-0fc57a2d1674-kubelet-dir\") pod \"fd62485b-5623-4974-b3dd-0fc57a2d1674\" (UID: \"fd62485b-5623-4974-b3dd-0fc57a2d1674\") " Oct 06 06:48:03 crc kubenswrapper[4845]: I1006 06:48:03.205707 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd62485b-5623-4974-b3dd-0fc57a2d1674-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fd62485b-5623-4974-b3dd-0fc57a2d1674" (UID: "fd62485b-5623-4974-b3dd-0fc57a2d1674"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 06:48:03 crc kubenswrapper[4845]: I1006 06:48:03.205965 4845 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd62485b-5623-4974-b3dd-0fc57a2d1674-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 06 06:48:03 crc kubenswrapper[4845]: I1006 06:48:03.216594 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd62485b-5623-4974-b3dd-0fc57a2d1674-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fd62485b-5623-4974-b3dd-0fc57a2d1674" (UID: "fd62485b-5623-4974-b3dd-0fc57a2d1674"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:48:03 crc kubenswrapper[4845]: I1006 06:48:03.307324 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd62485b-5623-4974-b3dd-0fc57a2d1674-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 06:48:03 crc kubenswrapper[4845]: I1006 06:48:03.330448 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fd62485b-5623-4974-b3dd-0fc57a2d1674","Type":"ContainerDied","Data":"41618b31075978cb62f01e87b8e1ac04a82c229b1c9d792e3fa5c87276246a9a"} Oct 06 06:48:03 crc kubenswrapper[4845]: I1006 06:48:03.330497 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41618b31075978cb62f01e87b8e1ac04a82c229b1c9d792e3fa5c87276246a9a" Oct 06 06:48:03 crc kubenswrapper[4845]: I1006 06:48:03.330775 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 06:48:06 crc kubenswrapper[4845]: I1006 06:48:06.864727 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" Oct 06 06:48:08 crc kubenswrapper[4845]: I1006 06:48:08.185112 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:48:08 crc kubenswrapper[4845]: I1006 06:48:08.193288 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:48:14 crc kubenswrapper[4845]: I1006 06:48:14.720809 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4l7qj"] Oct 06 06:48:17 crc kubenswrapper[4845]: I1006 06:48:17.459790 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" event={"ID":"f80a2f04-a041-4acb-ace9-c0e40aed5f6d","Type":"ContainerStarted","Data":"139d18dbaf52a743ba2b4c0cfdc8e9f0e24aace8bdcfa2f8107fb3e249470971"} Oct 06 06:48:18 crc kubenswrapper[4845]: E1006 06:48:18.972288 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 06 06:48:18 crc kubenswrapper[4845]: E1006 06:48:18.972619 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6zj78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-bp58w_openshift-marketplace(192eb68c-c202-49cc-8fda-31df1ba7c691): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 06:48:18 crc kubenswrapper[4845]: E1006 06:48:18.973929 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-bp58w" podUID="192eb68c-c202-49cc-8fda-31df1ba7c691" Oct 06 06:48:19 crc 
kubenswrapper[4845]: E1006 06:48:19.106929 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 06 06:48:19 crc kubenswrapper[4845]: E1006 06:48:19.107334 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pz7kh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-lqcvc_openshift-marketplace(1aa5e8be-a350-4d6d-b854-aabf6341b043): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 06:48:19 crc kubenswrapper[4845]: E1006 06:48:19.108541 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-lqcvc" podUID="1aa5e8be-a350-4d6d-b854-aabf6341b043" Oct 06 06:48:19 crc kubenswrapper[4845]: I1006 06:48:19.175607 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-777zb" Oct 06 06:48:19 crc kubenswrapper[4845]: I1006 06:48:19.473306 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t54n" event={"ID":"ae038f22-b33b-4909-9d36-1b04c873e809","Type":"ContainerStarted","Data":"b64d8e363857d6c06a42a166574c9478dd9202b34a90e58426050fad07864bb1"} Oct 06 06:48:19 crc kubenswrapper[4845]: I1006 06:48:19.475787 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4jm5" event={"ID":"847bb591-c8a9-4c1a-bace-005ba77c1644","Type":"ContainerStarted","Data":"d51323bbd0c00d4a9a5962704d014509c8f2e608e25340f3fc7323e32ef98f26"} Oct 06 06:48:19 crc kubenswrapper[4845]: I1006 06:48:19.478269 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbsmj" event={"ID":"526273f1-c5bb-49d8-828d-4957e03ee814","Type":"ContainerStarted","Data":"b44095ae985cd9a6777c586943bf4facc34b8c3e59f2681869f77ac5b349038c"} Oct 06 06:48:19 crc kubenswrapper[4845]: I1006 06:48:19.482017 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkq9g" 
event={"ID":"293fea8c-d627-4852-84fb-c98e487df3a0","Type":"ContainerStarted","Data":"e71f2aa7d40442431848e6757e543810c8e056cef1cdd3e08178cf841fc6e069"} Oct 06 06:48:19 crc kubenswrapper[4845]: I1006 06:48:19.487547 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n72q5" event={"ID":"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2","Type":"ContainerStarted","Data":"a9b252f00c73dae906120440f679a11ba24ccd299f134e14264aa1dfd83174af"} Oct 06 06:48:19 crc kubenswrapper[4845]: I1006 06:48:19.489006 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" event={"ID":"f80a2f04-a041-4acb-ace9-c0e40aed5f6d","Type":"ContainerStarted","Data":"f613caae6b22edad91477989d19cd4799b71d715e0354f805dae19e4c6199bba"} Oct 06 06:48:19 crc kubenswrapper[4845]: I1006 06:48:19.490900 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hnkmw" event={"ID":"3ab198b9-c91e-4d19-a404-88eece82b9d6","Type":"ContainerStarted","Data":"4d05d8e1893d08ce6c0b3960931b1fb29cf79f404918f7295f8e3636ab985efe"} Oct 06 06:48:19 crc kubenswrapper[4845]: E1006 06:48:19.491337 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-lqcvc" podUID="1aa5e8be-a350-4d6d-b854-aabf6341b043" Oct 06 06:48:19 crc kubenswrapper[4845]: E1006 06:48:19.491726 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-bp58w" podUID="192eb68c-c202-49cc-8fda-31df1ba7c691" Oct 06 06:48:20 crc kubenswrapper[4845]: I1006 06:48:20.501319 4845 generic.go:334] "Generic 
(PLEG): container finished" podID="3ab198b9-c91e-4d19-a404-88eece82b9d6" containerID="4d05d8e1893d08ce6c0b3960931b1fb29cf79f404918f7295f8e3636ab985efe" exitCode=0 Oct 06 06:48:20 crc kubenswrapper[4845]: I1006 06:48:20.501424 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hnkmw" event={"ID":"3ab198b9-c91e-4d19-a404-88eece82b9d6","Type":"ContainerDied","Data":"4d05d8e1893d08ce6c0b3960931b1fb29cf79f404918f7295f8e3636ab985efe"} Oct 06 06:48:20 crc kubenswrapper[4845]: I1006 06:48:20.508604 4845 generic.go:334] "Generic (PLEG): container finished" podID="847bb591-c8a9-4c1a-bace-005ba77c1644" containerID="d51323bbd0c00d4a9a5962704d014509c8f2e608e25340f3fc7323e32ef98f26" exitCode=0 Oct 06 06:48:20 crc kubenswrapper[4845]: I1006 06:48:20.508713 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4jm5" event={"ID":"847bb591-c8a9-4c1a-bace-005ba77c1644","Type":"ContainerDied","Data":"d51323bbd0c00d4a9a5962704d014509c8f2e608e25340f3fc7323e32ef98f26"} Oct 06 06:48:20 crc kubenswrapper[4845]: I1006 06:48:20.512695 4845 generic.go:334] "Generic (PLEG): container finished" podID="ae038f22-b33b-4909-9d36-1b04c873e809" containerID="b64d8e363857d6c06a42a166574c9478dd9202b34a90e58426050fad07864bb1" exitCode=0 Oct 06 06:48:20 crc kubenswrapper[4845]: I1006 06:48:20.512764 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t54n" event={"ID":"ae038f22-b33b-4909-9d36-1b04c873e809","Type":"ContainerDied","Data":"b64d8e363857d6c06a42a166574c9478dd9202b34a90e58426050fad07864bb1"} Oct 06 06:48:20 crc kubenswrapper[4845]: I1006 06:48:20.519514 4845 generic.go:334] "Generic (PLEG): container finished" podID="526273f1-c5bb-49d8-828d-4957e03ee814" containerID="b44095ae985cd9a6777c586943bf4facc34b8c3e59f2681869f77ac5b349038c" exitCode=0 Oct 06 06:48:20 crc kubenswrapper[4845]: I1006 06:48:20.519597 4845 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-vbsmj" event={"ID":"526273f1-c5bb-49d8-828d-4957e03ee814","Type":"ContainerDied","Data":"b44095ae985cd9a6777c586943bf4facc34b8c3e59f2681869f77ac5b349038c"} Oct 06 06:48:20 crc kubenswrapper[4845]: I1006 06:48:20.528834 4845 generic.go:334] "Generic (PLEG): container finished" podID="293fea8c-d627-4852-84fb-c98e487df3a0" containerID="e71f2aa7d40442431848e6757e543810c8e056cef1cdd3e08178cf841fc6e069" exitCode=0 Oct 06 06:48:20 crc kubenswrapper[4845]: I1006 06:48:20.529006 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkq9g" event={"ID":"293fea8c-d627-4852-84fb-c98e487df3a0","Type":"ContainerDied","Data":"e71f2aa7d40442431848e6757e543810c8e056cef1cdd3e08178cf841fc6e069"} Oct 06 06:48:20 crc kubenswrapper[4845]: I1006 06:48:20.538806 4845 generic.go:334] "Generic (PLEG): container finished" podID="f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2" containerID="a9b252f00c73dae906120440f679a11ba24ccd299f134e14264aa1dfd83174af" exitCode=0 Oct 06 06:48:20 crc kubenswrapper[4845]: I1006 06:48:20.538881 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n72q5" event={"ID":"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2","Type":"ContainerDied","Data":"a9b252f00c73dae906120440f679a11ba24ccd299f134e14264aa1dfd83174af"} Oct 06 06:48:20 crc kubenswrapper[4845]: I1006 06:48:20.542867 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4l7qj" event={"ID":"f80a2f04-a041-4acb-ace9-c0e40aed5f6d","Type":"ContainerStarted","Data":"06b1794ca5998342f6c1349982f8ecfd4cad17ab8b7d3ac40959048de6eb5141"} Oct 06 06:48:20 crc kubenswrapper[4845]: I1006 06:48:20.646271 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4l7qj" podStartSLOduration=164.646233161 podStartE2EDuration="2m44.646233161s" podCreationTimestamp="2025-10-06 06:45:36 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:48:20.635586982 +0000 UTC m=+185.150328010" watchObservedRunningTime="2025-10-06 06:48:20.646233161 +0000 UTC m=+185.160974219" Oct 06 06:48:23 crc kubenswrapper[4845]: I1006 06:48:23.019592 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 06:48:23 crc kubenswrapper[4845]: I1006 06:48:23.020213 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 06:48:23 crc kubenswrapper[4845]: I1006 06:48:23.563279 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hnkmw" event={"ID":"3ab198b9-c91e-4d19-a404-88eece82b9d6","Type":"ContainerStarted","Data":"63aaf21bd28e1abc25deeab1f4d57fae0a47281fe82feea95d1234295a38bc9c"} Oct 06 06:48:24 crc kubenswrapper[4845]: I1006 06:48:24.546523 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 06:48:24 crc kubenswrapper[4845]: I1006 06:48:24.589887 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hnkmw" podStartSLOduration=4.848541255 podStartE2EDuration="40.589867521s" podCreationTimestamp="2025-10-06 06:47:44 +0000 UTC" firstStartedPulling="2025-10-06 06:47:47.100514368 +0000 UTC m=+151.615255366" lastFinishedPulling="2025-10-06 06:48:22.841840624 +0000 UTC 
m=+187.356581632" observedRunningTime="2025-10-06 06:48:24.584581657 +0000 UTC m=+189.099322745" watchObservedRunningTime="2025-10-06 06:48:24.589867521 +0000 UTC m=+189.104608529" Oct 06 06:48:25 crc kubenswrapper[4845]: I1006 06:48:25.043555 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hnkmw" Oct 06 06:48:25 crc kubenswrapper[4845]: I1006 06:48:25.043639 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hnkmw" Oct 06 06:48:25 crc kubenswrapper[4845]: I1006 06:48:25.578032 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbsmj" event={"ID":"526273f1-c5bb-49d8-828d-4957e03ee814","Type":"ContainerStarted","Data":"80f3efe0502585a915598ad7c1d9f31c3dea117f4294b91cf8ea2d2ef0bed761"} Oct 06 06:48:25 crc kubenswrapper[4845]: I1006 06:48:25.597663 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vbsmj" podStartSLOduration=3.540459905 podStartE2EDuration="37.597648831s" podCreationTimestamp="2025-10-06 06:47:48 +0000 UTC" firstStartedPulling="2025-10-06 06:47:50.209084153 +0000 UTC m=+154.723825161" lastFinishedPulling="2025-10-06 06:48:24.266273049 +0000 UTC m=+188.781014087" observedRunningTime="2025-10-06 06:48:25.596234746 +0000 UTC m=+190.110975774" watchObservedRunningTime="2025-10-06 06:48:25.597648831 +0000 UTC m=+190.112389839" Oct 06 06:48:26 crc kubenswrapper[4845]: I1006 06:48:26.893925 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-hnkmw" podUID="3ab198b9-c91e-4d19-a404-88eece82b9d6" containerName="registry-server" probeResult="failure" output=< Oct 06 06:48:26 crc kubenswrapper[4845]: timeout: failed to connect service ":50051" within 1s Oct 06 06:48:26 crc kubenswrapper[4845]: > Oct 06 06:48:27 crc kubenswrapper[4845]: I1006 06:48:27.600233 4845 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkq9g" event={"ID":"293fea8c-d627-4852-84fb-c98e487df3a0","Type":"ContainerStarted","Data":"c54606647957e0b25ff7902c145bfd32787ebe42aa04ff5a62b51befdcf2d798"} Oct 06 06:48:27 crc kubenswrapper[4845]: I1006 06:48:27.622133 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fkq9g" podStartSLOduration=3.999100331 podStartE2EDuration="42.622114397s" podCreationTimestamp="2025-10-06 06:47:45 +0000 UTC" firstStartedPulling="2025-10-06 06:47:47.104441977 +0000 UTC m=+151.619182995" lastFinishedPulling="2025-10-06 06:48:25.727456053 +0000 UTC m=+190.242197061" observedRunningTime="2025-10-06 06:48:27.621460551 +0000 UTC m=+192.136201559" watchObservedRunningTime="2025-10-06 06:48:27.622114397 +0000 UTC m=+192.136855405" Oct 06 06:48:28 crc kubenswrapper[4845]: I1006 06:48:28.608567 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n72q5" event={"ID":"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2","Type":"ContainerStarted","Data":"bd2a5b0c2394145e3b446f8f856603ed029dc3b6bf10976d3ba60b5b28ad92d2"} Oct 06 06:48:28 crc kubenswrapper[4845]: I1006 06:48:28.610810 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t54n" event={"ID":"ae038f22-b33b-4909-9d36-1b04c873e809","Type":"ContainerStarted","Data":"682d3f81f54ecfc60c47db8f383dbc4c5390addf3a349cbd120c6f61857f265f"} Oct 06 06:48:28 crc kubenswrapper[4845]: I1006 06:48:28.612777 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4jm5" event={"ID":"847bb591-c8a9-4c1a-bace-005ba77c1644","Type":"ContainerStarted","Data":"74bb71199efb820c9f3b3f616caf5f2001eb67f9f178d6a35febbaacdd2a21f4"} Oct 06 06:48:28 crc kubenswrapper[4845]: I1006 06:48:28.626979 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-n72q5" podStartSLOduration=3.908764623 podStartE2EDuration="44.626962783s" podCreationTimestamp="2025-10-06 06:47:44 +0000 UTC" firstStartedPulling="2025-10-06 06:47:47.113306732 +0000 UTC m=+151.628047740" lastFinishedPulling="2025-10-06 06:48:27.831504852 +0000 UTC m=+192.346245900" observedRunningTime="2025-10-06 06:48:28.625337992 +0000 UTC m=+193.140079000" watchObservedRunningTime="2025-10-06 06:48:28.626962783 +0000 UTC m=+193.141703791" Oct 06 06:48:28 crc kubenswrapper[4845]: I1006 06:48:28.645475 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6t54n" podStartSLOduration=4.632958696 podStartE2EDuration="43.64545505s" podCreationTimestamp="2025-10-06 06:47:45 +0000 UTC" firstStartedPulling="2025-10-06 06:47:47.07489328 +0000 UTC m=+151.589634288" lastFinishedPulling="2025-10-06 06:48:26.087389634 +0000 UTC m=+190.602130642" observedRunningTime="2025-10-06 06:48:28.642752002 +0000 UTC m=+193.157493010" watchObservedRunningTime="2025-10-06 06:48:28.64545505 +0000 UTC m=+193.160196058" Oct 06 06:48:28 crc kubenswrapper[4845]: I1006 06:48:28.665282 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n4jm5" podStartSLOduration=2.345953713 podStartE2EDuration="40.665255341s" podCreationTimestamp="2025-10-06 06:47:48 +0000 UTC" firstStartedPulling="2025-10-06 06:47:49.160877331 +0000 UTC m=+153.675618339" lastFinishedPulling="2025-10-06 06:48:27.480178939 +0000 UTC m=+191.994919967" observedRunningTime="2025-10-06 06:48:28.661761373 +0000 UTC m=+193.176502371" watchObservedRunningTime="2025-10-06 06:48:28.665255341 +0000 UTC m=+193.179996359" Oct 06 06:48:28 crc kubenswrapper[4845]: I1006 06:48:28.843296 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vbsmj" Oct 06 06:48:28 crc kubenswrapper[4845]: I1006 
06:48:28.843342 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vbsmj" Oct 06 06:48:29 crc kubenswrapper[4845]: I1006 06:48:29.891071 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vbsmj" podUID="526273f1-c5bb-49d8-828d-4957e03ee814" containerName="registry-server" probeResult="failure" output=< Oct 06 06:48:29 crc kubenswrapper[4845]: timeout: failed to connect service ":50051" within 1s Oct 06 06:48:29 crc kubenswrapper[4845]: > Oct 06 06:48:31 crc kubenswrapper[4845]: I1006 06:48:31.629353 4845 generic.go:334] "Generic (PLEG): container finished" podID="1aa5e8be-a350-4d6d-b854-aabf6341b043" containerID="8a530401f63b97b62b4792a0200eef60e760e7149d699f56086eb442d4ff6c8c" exitCode=0 Oct 06 06:48:31 crc kubenswrapper[4845]: I1006 06:48:31.629415 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqcvc" event={"ID":"1aa5e8be-a350-4d6d-b854-aabf6341b043","Type":"ContainerDied","Data":"8a530401f63b97b62b4792a0200eef60e760e7149d699f56086eb442d4ff6c8c"} Oct 06 06:48:32 crc kubenswrapper[4845]: I1006 06:48:32.636688 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqcvc" event={"ID":"1aa5e8be-a350-4d6d-b854-aabf6341b043","Type":"ContainerStarted","Data":"baec662ae04680c134d87c2e6796cfd311ff0862a37aea3d9b30272c21fd91f5"} Oct 06 06:48:32 crc kubenswrapper[4845]: I1006 06:48:32.663668 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lqcvc" podStartSLOduration=2.725027524 podStartE2EDuration="46.663652003s" podCreationTimestamp="2025-10-06 06:47:46 +0000 UTC" firstStartedPulling="2025-10-06 06:47:48.144552005 +0000 UTC m=+152.659293013" lastFinishedPulling="2025-10-06 06:48:32.083176484 +0000 UTC m=+196.597917492" observedRunningTime="2025-10-06 06:48:32.659908008 +0000 UTC m=+197.174649036" 
watchObservedRunningTime="2025-10-06 06:48:32.663652003 +0000 UTC m=+197.178393011" Oct 06 06:48:34 crc kubenswrapper[4845]: I1006 06:48:34.646165 4845 generic.go:334] "Generic (PLEG): container finished" podID="192eb68c-c202-49cc-8fda-31df1ba7c691" containerID="3a0ba629532792cfe1d3b764c917d8463a423f2b0ab0565cec2d6c9fda79ee5d" exitCode=0 Oct 06 06:48:34 crc kubenswrapper[4845]: I1006 06:48:34.646880 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp58w" event={"ID":"192eb68c-c202-49cc-8fda-31df1ba7c691","Type":"ContainerDied","Data":"3a0ba629532792cfe1d3b764c917d8463a423f2b0ab0565cec2d6c9fda79ee5d"} Oct 06 06:48:35 crc kubenswrapper[4845]: I1006 06:48:35.103040 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hnkmw" Oct 06 06:48:35 crc kubenswrapper[4845]: I1006 06:48:35.147085 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hnkmw" Oct 06 06:48:35 crc kubenswrapper[4845]: I1006 06:48:35.290406 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n72q5" Oct 06 06:48:35 crc kubenswrapper[4845]: I1006 06:48:35.290675 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n72q5" Oct 06 06:48:35 crc kubenswrapper[4845]: I1006 06:48:35.352927 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n72q5" Oct 06 06:48:35 crc kubenswrapper[4845]: I1006 06:48:35.554832 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6t54n" Oct 06 06:48:35 crc kubenswrapper[4845]: I1006 06:48:35.554879 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6t54n" Oct 06 
06:48:35 crc kubenswrapper[4845]: I1006 06:48:35.620159 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6t54n" Oct 06 06:48:35 crc kubenswrapper[4845]: I1006 06:48:35.653619 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp58w" event={"ID":"192eb68c-c202-49cc-8fda-31df1ba7c691","Type":"ContainerStarted","Data":"2e826622c1f67ba12f8ddb41b789d92b3330aef6bdfa2e8b3faa862164d47db2"} Oct 06 06:48:35 crc kubenswrapper[4845]: I1006 06:48:35.660505 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bhflm"] Oct 06 06:48:35 crc kubenswrapper[4845]: I1006 06:48:35.684319 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bp58w" podStartSLOduration=1.765608226 podStartE2EDuration="48.684300861s" podCreationTimestamp="2025-10-06 06:47:47 +0000 UTC" firstStartedPulling="2025-10-06 06:47:48.140311618 +0000 UTC m=+152.655052626" lastFinishedPulling="2025-10-06 06:48:35.059004253 +0000 UTC m=+199.573745261" observedRunningTime="2025-10-06 06:48:35.684040033 +0000 UTC m=+200.198781041" watchObservedRunningTime="2025-10-06 06:48:35.684300861 +0000 UTC m=+200.199041869" Oct 06 06:48:35 crc kubenswrapper[4845]: I1006 06:48:35.720230 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6t54n" Oct 06 06:48:35 crc kubenswrapper[4845]: I1006 06:48:35.727566 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n72q5" Oct 06 06:48:35 crc kubenswrapper[4845]: I1006 06:48:35.736989 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fkq9g" Oct 06 06:48:35 crc kubenswrapper[4845]: I1006 06:48:35.737046 4845 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-fkq9g" Oct 06 06:48:35 crc kubenswrapper[4845]: I1006 06:48:35.785295 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fkq9g" Oct 06 06:48:36 crc kubenswrapper[4845]: I1006 06:48:36.694053 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fkq9g" Oct 06 06:48:37 crc kubenswrapper[4845]: I1006 06:48:37.123992 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lqcvc" Oct 06 06:48:37 crc kubenswrapper[4845]: I1006 06:48:37.124034 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lqcvc" Oct 06 06:48:37 crc kubenswrapper[4845]: I1006 06:48:37.164554 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lqcvc" Oct 06 06:48:37 crc kubenswrapper[4845]: I1006 06:48:37.438660 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bp58w" Oct 06 06:48:37 crc kubenswrapper[4845]: I1006 06:48:37.438695 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bp58w" Oct 06 06:48:37 crc kubenswrapper[4845]: I1006 06:48:37.476242 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bp58w" Oct 06 06:48:37 crc kubenswrapper[4845]: I1006 06:48:37.514835 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6t54n"] Oct 06 06:48:37 crc kubenswrapper[4845]: I1006 06:48:37.663217 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6t54n" 
podUID="ae038f22-b33b-4909-9d36-1b04c873e809" containerName="registry-server" containerID="cri-o://682d3f81f54ecfc60c47db8f383dbc4c5390addf3a349cbd120c6f61857f265f" gracePeriod=2
Oct 06 06:48:37 crc kubenswrapper[4845]: I1006 06:48:37.711981 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lqcvc"
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.049309 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6t54n"
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.115062 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fkq9g"]
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.185828 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcdd6\" (UniqueName: \"kubernetes.io/projected/ae038f22-b33b-4909-9d36-1b04c873e809-kube-api-access-vcdd6\") pod \"ae038f22-b33b-4909-9d36-1b04c873e809\" (UID: \"ae038f22-b33b-4909-9d36-1b04c873e809\") "
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.185895 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae038f22-b33b-4909-9d36-1b04c873e809-utilities\") pod \"ae038f22-b33b-4909-9d36-1b04c873e809\" (UID: \"ae038f22-b33b-4909-9d36-1b04c873e809\") "
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.185996 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae038f22-b33b-4909-9d36-1b04c873e809-catalog-content\") pod \"ae038f22-b33b-4909-9d36-1b04c873e809\" (UID: \"ae038f22-b33b-4909-9d36-1b04c873e809\") "
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.186755 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae038f22-b33b-4909-9d36-1b04c873e809-utilities" (OuterVolumeSpecName: "utilities") pod "ae038f22-b33b-4909-9d36-1b04c873e809" (UID: "ae038f22-b33b-4909-9d36-1b04c873e809"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.191774 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae038f22-b33b-4909-9d36-1b04c873e809-kube-api-access-vcdd6" (OuterVolumeSpecName: "kube-api-access-vcdd6") pod "ae038f22-b33b-4909-9d36-1b04c873e809" (UID: "ae038f22-b33b-4909-9d36-1b04c873e809"). InnerVolumeSpecName "kube-api-access-vcdd6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.228943 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae038f22-b33b-4909-9d36-1b04c873e809-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae038f22-b33b-4909-9d36-1b04c873e809" (UID: "ae038f22-b33b-4909-9d36-1b04c873e809"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.287329 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcdd6\" (UniqueName: \"kubernetes.io/projected/ae038f22-b33b-4909-9d36-1b04c873e809-kube-api-access-vcdd6\") on node \"crc\" DevicePath \"\""
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.287367 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae038f22-b33b-4909-9d36-1b04c873e809-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.287399 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae038f22-b33b-4909-9d36-1b04c873e809-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.471419 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n4jm5"
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.471466 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n4jm5"
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.515940 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n4jm5"
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.668589 4845 generic.go:334] "Generic (PLEG): container finished" podID="ae038f22-b33b-4909-9d36-1b04c873e809" containerID="682d3f81f54ecfc60c47db8f383dbc4c5390addf3a349cbd120c6f61857f265f" exitCode=0
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.668660 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6t54n"
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.668685 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t54n" event={"ID":"ae038f22-b33b-4909-9d36-1b04c873e809","Type":"ContainerDied","Data":"682d3f81f54ecfc60c47db8f383dbc4c5390addf3a349cbd120c6f61857f265f"}
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.668723 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6t54n" event={"ID":"ae038f22-b33b-4909-9d36-1b04c873e809","Type":"ContainerDied","Data":"e4a0b32360ecfc1c816c80a96da0345c953ea5aff815954e0d4bbfd45bfd43b8"}
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.668743 4845 scope.go:117] "RemoveContainer" containerID="682d3f81f54ecfc60c47db8f383dbc4c5390addf3a349cbd120c6f61857f265f"
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.668816 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fkq9g" podUID="293fea8c-d627-4852-84fb-c98e487df3a0" containerName="registry-server" containerID="cri-o://c54606647957e0b25ff7902c145bfd32787ebe42aa04ff5a62b51befdcf2d798" gracePeriod=2
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.685248 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6t54n"]
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.690414 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6t54n"]
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.695654 4845 scope.go:117] "RemoveContainer" containerID="b64d8e363857d6c06a42a166574c9478dd9202b34a90e58426050fad07864bb1"
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.716844 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n4jm5"
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.725677 4845 scope.go:117] "RemoveContainer" containerID="6a5b1df7e31b44d00bef4eed33fbbdc1eb8405dfb1494a0cea5fc064b9d644c8"
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.764464 4845 scope.go:117] "RemoveContainer" containerID="682d3f81f54ecfc60c47db8f383dbc4c5390addf3a349cbd120c6f61857f265f"
Oct 06 06:48:38 crc kubenswrapper[4845]: E1006 06:48:38.768687 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"682d3f81f54ecfc60c47db8f383dbc4c5390addf3a349cbd120c6f61857f265f\": container with ID starting with 682d3f81f54ecfc60c47db8f383dbc4c5390addf3a349cbd120c6f61857f265f not found: ID does not exist" containerID="682d3f81f54ecfc60c47db8f383dbc4c5390addf3a349cbd120c6f61857f265f"
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.768727 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"682d3f81f54ecfc60c47db8f383dbc4c5390addf3a349cbd120c6f61857f265f"} err="failed to get container status \"682d3f81f54ecfc60c47db8f383dbc4c5390addf3a349cbd120c6f61857f265f\": rpc error: code = NotFound desc = could not find container \"682d3f81f54ecfc60c47db8f383dbc4c5390addf3a349cbd120c6f61857f265f\": container with ID starting with 682d3f81f54ecfc60c47db8f383dbc4c5390addf3a349cbd120c6f61857f265f not found: ID does not exist"
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.768776 4845 scope.go:117] "RemoveContainer" containerID="b64d8e363857d6c06a42a166574c9478dd9202b34a90e58426050fad07864bb1"
Oct 06 06:48:38 crc kubenswrapper[4845]: E1006 06:48:38.769072 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b64d8e363857d6c06a42a166574c9478dd9202b34a90e58426050fad07864bb1\": container with ID starting with b64d8e363857d6c06a42a166574c9478dd9202b34a90e58426050fad07864bb1 not found: ID does not exist" containerID="b64d8e363857d6c06a42a166574c9478dd9202b34a90e58426050fad07864bb1"
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.769092 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b64d8e363857d6c06a42a166574c9478dd9202b34a90e58426050fad07864bb1"} err="failed to get container status \"b64d8e363857d6c06a42a166574c9478dd9202b34a90e58426050fad07864bb1\": rpc error: code = NotFound desc = could not find container \"b64d8e363857d6c06a42a166574c9478dd9202b34a90e58426050fad07864bb1\": container with ID starting with b64d8e363857d6c06a42a166574c9478dd9202b34a90e58426050fad07864bb1 not found: ID does not exist"
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.769105 4845 scope.go:117] "RemoveContainer" containerID="6a5b1df7e31b44d00bef4eed33fbbdc1eb8405dfb1494a0cea5fc064b9d644c8"
Oct 06 06:48:38 crc kubenswrapper[4845]: E1006 06:48:38.769280 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a5b1df7e31b44d00bef4eed33fbbdc1eb8405dfb1494a0cea5fc064b9d644c8\": container with ID starting with 6a5b1df7e31b44d00bef4eed33fbbdc1eb8405dfb1494a0cea5fc064b9d644c8 not found: ID does not exist" containerID="6a5b1df7e31b44d00bef4eed33fbbdc1eb8405dfb1494a0cea5fc064b9d644c8"
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.769303 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a5b1df7e31b44d00bef4eed33fbbdc1eb8405dfb1494a0cea5fc064b9d644c8"} err="failed to get container status \"6a5b1df7e31b44d00bef4eed33fbbdc1eb8405dfb1494a0cea5fc064b9d644c8\": rpc error: code = NotFound desc = could not find container \"6a5b1df7e31b44d00bef4eed33fbbdc1eb8405dfb1494a0cea5fc064b9d644c8\": container with ID starting with 6a5b1df7e31b44d00bef4eed33fbbdc1eb8405dfb1494a0cea5fc064b9d644c8 not found: ID does not exist"
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.887242 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vbsmj"
Oct 06 06:48:38 crc kubenswrapper[4845]: I1006 06:48:38.936610 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vbsmj"
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.002332 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fkq9g"
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.196840 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/293fea8c-d627-4852-84fb-c98e487df3a0-utilities\") pod \"293fea8c-d627-4852-84fb-c98e487df3a0\" (UID: \"293fea8c-d627-4852-84fb-c98e487df3a0\") "
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.196934 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7d66\" (UniqueName: \"kubernetes.io/projected/293fea8c-d627-4852-84fb-c98e487df3a0-kube-api-access-f7d66\") pod \"293fea8c-d627-4852-84fb-c98e487df3a0\" (UID: \"293fea8c-d627-4852-84fb-c98e487df3a0\") "
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.196996 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/293fea8c-d627-4852-84fb-c98e487df3a0-catalog-content\") pod \"293fea8c-d627-4852-84fb-c98e487df3a0\" (UID: \"293fea8c-d627-4852-84fb-c98e487df3a0\") "
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.197618 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/293fea8c-d627-4852-84fb-c98e487df3a0-utilities" (OuterVolumeSpecName: "utilities") pod "293fea8c-d627-4852-84fb-c98e487df3a0" (UID: "293fea8c-d627-4852-84fb-c98e487df3a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.242357 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/293fea8c-d627-4852-84fb-c98e487df3a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "293fea8c-d627-4852-84fb-c98e487df3a0" (UID: "293fea8c-d627-4852-84fb-c98e487df3a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.263551 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/293fea8c-d627-4852-84fb-c98e487df3a0-kube-api-access-f7d66" (OuterVolumeSpecName: "kube-api-access-f7d66") pod "293fea8c-d627-4852-84fb-c98e487df3a0" (UID: "293fea8c-d627-4852-84fb-c98e487df3a0"). InnerVolumeSpecName "kube-api-access-f7d66". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.298559 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7d66\" (UniqueName: \"kubernetes.io/projected/293fea8c-d627-4852-84fb-c98e487df3a0-kube-api-access-f7d66\") on node \"crc\" DevicePath \"\""
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.298587 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/293fea8c-d627-4852-84fb-c98e487df3a0-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.298599 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/293fea8c-d627-4852-84fb-c98e487df3a0-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.678956 4845 generic.go:334] "Generic (PLEG): container finished" podID="293fea8c-d627-4852-84fb-c98e487df3a0" containerID="c54606647957e0b25ff7902c145bfd32787ebe42aa04ff5a62b51befdcf2d798" exitCode=0
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.679007 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fkq9g"
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.679032 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkq9g" event={"ID":"293fea8c-d627-4852-84fb-c98e487df3a0","Type":"ContainerDied","Data":"c54606647957e0b25ff7902c145bfd32787ebe42aa04ff5a62b51befdcf2d798"}
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.679064 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkq9g" event={"ID":"293fea8c-d627-4852-84fb-c98e487df3a0","Type":"ContainerDied","Data":"a833b5ab807f78f8f965f11efc9ca0c5df54911fef8982acc5cd59f7ee7d0701"}
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.679083 4845 scope.go:117] "RemoveContainer" containerID="c54606647957e0b25ff7902c145bfd32787ebe42aa04ff5a62b51befdcf2d798"
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.697675 4845 scope.go:117] "RemoveContainer" containerID="e71f2aa7d40442431848e6757e543810c8e056cef1cdd3e08178cf841fc6e069"
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.715814 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fkq9g"]
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.720502 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fkq9g"]
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.728972 4845 scope.go:117] "RemoveContainer" containerID="aa05ba6396230f9c785b1cb1486db092223209386e4b194a3af92abbcbbcb892"
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.741098 4845 scope.go:117] "RemoveContainer" containerID="c54606647957e0b25ff7902c145bfd32787ebe42aa04ff5a62b51befdcf2d798"
Oct 06 06:48:39 crc kubenswrapper[4845]: E1006 06:48:39.741430 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c54606647957e0b25ff7902c145bfd32787ebe42aa04ff5a62b51befdcf2d798\": container with ID starting with c54606647957e0b25ff7902c145bfd32787ebe42aa04ff5a62b51befdcf2d798 not found: ID does not exist" containerID="c54606647957e0b25ff7902c145bfd32787ebe42aa04ff5a62b51befdcf2d798"
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.741469 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c54606647957e0b25ff7902c145bfd32787ebe42aa04ff5a62b51befdcf2d798"} err="failed to get container status \"c54606647957e0b25ff7902c145bfd32787ebe42aa04ff5a62b51befdcf2d798\": rpc error: code = NotFound desc = could not find container \"c54606647957e0b25ff7902c145bfd32787ebe42aa04ff5a62b51befdcf2d798\": container with ID starting with c54606647957e0b25ff7902c145bfd32787ebe42aa04ff5a62b51befdcf2d798 not found: ID does not exist"
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.741492 4845 scope.go:117] "RemoveContainer" containerID="e71f2aa7d40442431848e6757e543810c8e056cef1cdd3e08178cf841fc6e069"
Oct 06 06:48:39 crc kubenswrapper[4845]: E1006 06:48:39.741891 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e71f2aa7d40442431848e6757e543810c8e056cef1cdd3e08178cf841fc6e069\": container with ID starting with e71f2aa7d40442431848e6757e543810c8e056cef1cdd3e08178cf841fc6e069 not found: ID does not exist" containerID="e71f2aa7d40442431848e6757e543810c8e056cef1cdd3e08178cf841fc6e069"
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.741929 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e71f2aa7d40442431848e6757e543810c8e056cef1cdd3e08178cf841fc6e069"} err="failed to get container status \"e71f2aa7d40442431848e6757e543810c8e056cef1cdd3e08178cf841fc6e069\": rpc error: code = NotFound desc = could not find container \"e71f2aa7d40442431848e6757e543810c8e056cef1cdd3e08178cf841fc6e069\": container with ID starting with e71f2aa7d40442431848e6757e543810c8e056cef1cdd3e08178cf841fc6e069 not found: ID does not exist"
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.741959 4845 scope.go:117] "RemoveContainer" containerID="aa05ba6396230f9c785b1cb1486db092223209386e4b194a3af92abbcbbcb892"
Oct 06 06:48:39 crc kubenswrapper[4845]: E1006 06:48:39.742224 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa05ba6396230f9c785b1cb1486db092223209386e4b194a3af92abbcbbcb892\": container with ID starting with aa05ba6396230f9c785b1cb1486db092223209386e4b194a3af92abbcbbcb892 not found: ID does not exist" containerID="aa05ba6396230f9c785b1cb1486db092223209386e4b194a3af92abbcbbcb892"
Oct 06 06:48:39 crc kubenswrapper[4845]: I1006 06:48:39.742253 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa05ba6396230f9c785b1cb1486db092223209386e4b194a3af92abbcbbcb892"} err="failed to get container status \"aa05ba6396230f9c785b1cb1486db092223209386e4b194a3af92abbcbbcb892\": rpc error: code = NotFound desc = could not find container \"aa05ba6396230f9c785b1cb1486db092223209386e4b194a3af92abbcbbcb892\": container with ID starting with aa05ba6396230f9c785b1cb1486db092223209386e4b194a3af92abbcbbcb892 not found: ID does not exist"
Oct 06 06:48:40 crc kubenswrapper[4845]: I1006 06:48:40.233460 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="293fea8c-d627-4852-84fb-c98e487df3a0" path="/var/lib/kubelet/pods/293fea8c-d627-4852-84fb-c98e487df3a0/volumes"
Oct 06 06:48:40 crc kubenswrapper[4845]: I1006 06:48:40.234251 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae038f22-b33b-4909-9d36-1b04c873e809" path="/var/lib/kubelet/pods/ae038f22-b33b-4909-9d36-1b04c873e809/volumes"
Oct 06 06:48:42 crc kubenswrapper[4845]: I1006 06:48:42.515532 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vbsmj"]
Oct 06 06:48:42 crc kubenswrapper[4845]: I1006 06:48:42.515995 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vbsmj" podUID="526273f1-c5bb-49d8-828d-4957e03ee814" containerName="registry-server" containerID="cri-o://80f3efe0502585a915598ad7c1d9f31c3dea117f4294b91cf8ea2d2ef0bed761" gracePeriod=2
Oct 06 06:48:42 crc kubenswrapper[4845]: I1006 06:48:42.701630 4845 generic.go:334] "Generic (PLEG): container finished" podID="526273f1-c5bb-49d8-828d-4957e03ee814" containerID="80f3efe0502585a915598ad7c1d9f31c3dea117f4294b91cf8ea2d2ef0bed761" exitCode=0
Oct 06 06:48:42 crc kubenswrapper[4845]: I1006 06:48:42.702224 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbsmj" event={"ID":"526273f1-c5bb-49d8-828d-4957e03ee814","Type":"ContainerDied","Data":"80f3efe0502585a915598ad7c1d9f31c3dea117f4294b91cf8ea2d2ef0bed761"}
Oct 06 06:48:42 crc kubenswrapper[4845]: I1006 06:48:42.937960 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vbsmj"
Oct 06 06:48:43 crc kubenswrapper[4845]: I1006 06:48:43.042617 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/526273f1-c5bb-49d8-828d-4957e03ee814-catalog-content\") pod \"526273f1-c5bb-49d8-828d-4957e03ee814\" (UID: \"526273f1-c5bb-49d8-828d-4957e03ee814\") "
Oct 06 06:48:43 crc kubenswrapper[4845]: I1006 06:48:43.042680 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526273f1-c5bb-49d8-828d-4957e03ee814-utilities\") pod \"526273f1-c5bb-49d8-828d-4957e03ee814\" (UID: \"526273f1-c5bb-49d8-828d-4957e03ee814\") "
Oct 06 06:48:43 crc kubenswrapper[4845]: I1006 06:48:43.042708 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c82t7\" (UniqueName: \"kubernetes.io/projected/526273f1-c5bb-49d8-828d-4957e03ee814-kube-api-access-c82t7\") pod \"526273f1-c5bb-49d8-828d-4957e03ee814\" (UID: \"526273f1-c5bb-49d8-828d-4957e03ee814\") "
Oct 06 06:48:43 crc kubenswrapper[4845]: I1006 06:48:43.043994 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/526273f1-c5bb-49d8-828d-4957e03ee814-utilities" (OuterVolumeSpecName: "utilities") pod "526273f1-c5bb-49d8-828d-4957e03ee814" (UID: "526273f1-c5bb-49d8-828d-4957e03ee814"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:48:43 crc kubenswrapper[4845]: I1006 06:48:43.049698 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/526273f1-c5bb-49d8-828d-4957e03ee814-kube-api-access-c82t7" (OuterVolumeSpecName: "kube-api-access-c82t7") pod "526273f1-c5bb-49d8-828d-4957e03ee814" (UID: "526273f1-c5bb-49d8-828d-4957e03ee814"). InnerVolumeSpecName "kube-api-access-c82t7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:48:43 crc kubenswrapper[4845]: I1006 06:48:43.144243 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/526273f1-c5bb-49d8-828d-4957e03ee814-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "526273f1-c5bb-49d8-828d-4957e03ee814" (UID: "526273f1-c5bb-49d8-828d-4957e03ee814"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:48:43 crc kubenswrapper[4845]: I1006 06:48:43.145324 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/526273f1-c5bb-49d8-828d-4957e03ee814-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 06:48:43 crc kubenswrapper[4845]: I1006 06:48:43.145398 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c82t7\" (UniqueName: \"kubernetes.io/projected/526273f1-c5bb-49d8-828d-4957e03ee814-kube-api-access-c82t7\") on node \"crc\" DevicePath \"\""
Oct 06 06:48:43 crc kubenswrapper[4845]: I1006 06:48:43.145430 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/526273f1-c5bb-49d8-828d-4957e03ee814-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 06:48:43 crc kubenswrapper[4845]: I1006 06:48:43.710850 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbsmj" event={"ID":"526273f1-c5bb-49d8-828d-4957e03ee814","Type":"ContainerDied","Data":"cc0f8e32f429fd5a1790b39ff94208c054891cd746b9a07aefed2c9c10dbdc9d"}
Oct 06 06:48:43 crc kubenswrapper[4845]: I1006 06:48:43.710912 4845 scope.go:117] "RemoveContainer" containerID="80f3efe0502585a915598ad7c1d9f31c3dea117f4294b91cf8ea2d2ef0bed761"
Oct 06 06:48:43 crc kubenswrapper[4845]: I1006 06:48:43.711053 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vbsmj"
Oct 06 06:48:43 crc kubenswrapper[4845]: I1006 06:48:43.742656 4845 scope.go:117] "RemoveContainer" containerID="b44095ae985cd9a6777c586943bf4facc34b8c3e59f2681869f77ac5b349038c"
Oct 06 06:48:43 crc kubenswrapper[4845]: I1006 06:48:43.751315 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vbsmj"]
Oct 06 06:48:43 crc kubenswrapper[4845]: I1006 06:48:43.756861 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vbsmj"]
Oct 06 06:48:43 crc kubenswrapper[4845]: I1006 06:48:43.786152 4845 scope.go:117] "RemoveContainer" containerID="ce0faab006057b8e3bd9aacdf400129683f74bfa3c78f9ab892386d31507cfd7"
Oct 06 06:48:44 crc kubenswrapper[4845]: I1006 06:48:44.233704 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="526273f1-c5bb-49d8-828d-4957e03ee814" path="/var/lib/kubelet/pods/526273f1-c5bb-49d8-828d-4957e03ee814/volumes"
Oct 06 06:48:47 crc kubenswrapper[4845]: I1006 06:48:47.479460 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bp58w"
Oct 06 06:48:49 crc kubenswrapper[4845]: I1006 06:48:49.917149 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp58w"]
Oct 06 06:48:49 crc kubenswrapper[4845]: I1006 06:48:49.917806 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bp58w" podUID="192eb68c-c202-49cc-8fda-31df1ba7c691" containerName="registry-server" containerID="cri-o://2e826622c1f67ba12f8ddb41b789d92b3330aef6bdfa2e8b3faa862164d47db2" gracePeriod=2
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.336036 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bp58w"
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.460646 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zj78\" (UniqueName: \"kubernetes.io/projected/192eb68c-c202-49cc-8fda-31df1ba7c691-kube-api-access-6zj78\") pod \"192eb68c-c202-49cc-8fda-31df1ba7c691\" (UID: \"192eb68c-c202-49cc-8fda-31df1ba7c691\") "
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.460732 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/192eb68c-c202-49cc-8fda-31df1ba7c691-catalog-content\") pod \"192eb68c-c202-49cc-8fda-31df1ba7c691\" (UID: \"192eb68c-c202-49cc-8fda-31df1ba7c691\") "
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.460805 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/192eb68c-c202-49cc-8fda-31df1ba7c691-utilities\") pod \"192eb68c-c202-49cc-8fda-31df1ba7c691\" (UID: \"192eb68c-c202-49cc-8fda-31df1ba7c691\") "
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.461785 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/192eb68c-c202-49cc-8fda-31df1ba7c691-utilities" (OuterVolumeSpecName: "utilities") pod "192eb68c-c202-49cc-8fda-31df1ba7c691" (UID: "192eb68c-c202-49cc-8fda-31df1ba7c691"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.466016 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/192eb68c-c202-49cc-8fda-31df1ba7c691-kube-api-access-6zj78" (OuterVolumeSpecName: "kube-api-access-6zj78") pod "192eb68c-c202-49cc-8fda-31df1ba7c691" (UID: "192eb68c-c202-49cc-8fda-31df1ba7c691"). InnerVolumeSpecName "kube-api-access-6zj78". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.473409 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/192eb68c-c202-49cc-8fda-31df1ba7c691-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "192eb68c-c202-49cc-8fda-31df1ba7c691" (UID: "192eb68c-c202-49cc-8fda-31df1ba7c691"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.561974 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/192eb68c-c202-49cc-8fda-31df1ba7c691-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.562012 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zj78\" (UniqueName: \"kubernetes.io/projected/192eb68c-c202-49cc-8fda-31df1ba7c691-kube-api-access-6zj78\") on node \"crc\" DevicePath \"\""
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.562027 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/192eb68c-c202-49cc-8fda-31df1ba7c691-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.771063 4845 generic.go:334] "Generic (PLEG): container finished" podID="192eb68c-c202-49cc-8fda-31df1ba7c691" containerID="2e826622c1f67ba12f8ddb41b789d92b3330aef6bdfa2e8b3faa862164d47db2" exitCode=0
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.771442 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bp58w"
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.771427 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp58w" event={"ID":"192eb68c-c202-49cc-8fda-31df1ba7c691","Type":"ContainerDied","Data":"2e826622c1f67ba12f8ddb41b789d92b3330aef6bdfa2e8b3faa862164d47db2"}
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.771664 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp58w" event={"ID":"192eb68c-c202-49cc-8fda-31df1ba7c691","Type":"ContainerDied","Data":"0d5aa323569fd106bbfb44c408affe5000949f422cadc9a2c91e795f30e1af69"}
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.771687 4845 scope.go:117] "RemoveContainer" containerID="2e826622c1f67ba12f8ddb41b789d92b3330aef6bdfa2e8b3faa862164d47db2"
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.788391 4845 scope.go:117] "RemoveContainer" containerID="3a0ba629532792cfe1d3b764c917d8463a423f2b0ab0565cec2d6c9fda79ee5d"
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.801837 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp58w"]
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.804202 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp58w"]
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.826422 4845 scope.go:117] "RemoveContainer" containerID="bed96113091d2e2af7371312cc384e9f264d5dafe2dfdc02503eb85965fcaf88"
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.844363 4845 scope.go:117] "RemoveContainer" containerID="2e826622c1f67ba12f8ddb41b789d92b3330aef6bdfa2e8b3faa862164d47db2"
Oct 06 06:48:50 crc kubenswrapper[4845]: E1006 06:48:50.844820 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e826622c1f67ba12f8ddb41b789d92b3330aef6bdfa2e8b3faa862164d47db2\": container with ID starting with 2e826622c1f67ba12f8ddb41b789d92b3330aef6bdfa2e8b3faa862164d47db2 not found: ID does not exist" containerID="2e826622c1f67ba12f8ddb41b789d92b3330aef6bdfa2e8b3faa862164d47db2"
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.844937 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e826622c1f67ba12f8ddb41b789d92b3330aef6bdfa2e8b3faa862164d47db2"} err="failed to get container status \"2e826622c1f67ba12f8ddb41b789d92b3330aef6bdfa2e8b3faa862164d47db2\": rpc error: code = NotFound desc = could not find container \"2e826622c1f67ba12f8ddb41b789d92b3330aef6bdfa2e8b3faa862164d47db2\": container with ID starting with 2e826622c1f67ba12f8ddb41b789d92b3330aef6bdfa2e8b3faa862164d47db2 not found: ID does not exist"
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.845024 4845 scope.go:117] "RemoveContainer" containerID="3a0ba629532792cfe1d3b764c917d8463a423f2b0ab0565cec2d6c9fda79ee5d"
Oct 06 06:48:50 crc kubenswrapper[4845]: E1006 06:48:50.845590 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a0ba629532792cfe1d3b764c917d8463a423f2b0ab0565cec2d6c9fda79ee5d\": container with ID starting with 3a0ba629532792cfe1d3b764c917d8463a423f2b0ab0565cec2d6c9fda79ee5d not found: ID does not exist" containerID="3a0ba629532792cfe1d3b764c917d8463a423f2b0ab0565cec2d6c9fda79ee5d"
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.845620 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a0ba629532792cfe1d3b764c917d8463a423f2b0ab0565cec2d6c9fda79ee5d"} err="failed to get container status \"3a0ba629532792cfe1d3b764c917d8463a423f2b0ab0565cec2d6c9fda79ee5d\": rpc error: code = NotFound desc = could not find container \"3a0ba629532792cfe1d3b764c917d8463a423f2b0ab0565cec2d6c9fda79ee5d\": container with ID starting with 3a0ba629532792cfe1d3b764c917d8463a423f2b0ab0565cec2d6c9fda79ee5d not found: ID does not exist"
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.845643 4845 scope.go:117] "RemoveContainer" containerID="bed96113091d2e2af7371312cc384e9f264d5dafe2dfdc02503eb85965fcaf88"
Oct 06 06:48:50 crc kubenswrapper[4845]: E1006 06:48:50.846048 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bed96113091d2e2af7371312cc384e9f264d5dafe2dfdc02503eb85965fcaf88\": container with ID starting with bed96113091d2e2af7371312cc384e9f264d5dafe2dfdc02503eb85965fcaf88 not found: ID does not exist" containerID="bed96113091d2e2af7371312cc384e9f264d5dafe2dfdc02503eb85965fcaf88"
Oct 06 06:48:50 crc kubenswrapper[4845]: I1006 06:48:50.846073 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed96113091d2e2af7371312cc384e9f264d5dafe2dfdc02503eb85965fcaf88"} err="failed to get container status \"bed96113091d2e2af7371312cc384e9f264d5dafe2dfdc02503eb85965fcaf88\": rpc error: code = NotFound desc = could not find container \"bed96113091d2e2af7371312cc384e9f264d5dafe2dfdc02503eb85965fcaf88\": container with ID starting with bed96113091d2e2af7371312cc384e9f264d5dafe2dfdc02503eb85965fcaf88 not found: ID does not exist"
Oct 06 06:48:52 crc kubenswrapper[4845]: I1006 06:48:52.232389 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="192eb68c-c202-49cc-8fda-31df1ba7c691" path="/var/lib/kubelet/pods/192eb68c-c202-49cc-8fda-31df1ba7c691/volumes"
Oct 06 06:48:53 crc kubenswrapper[4845]: I1006 06:48:53.019163 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 06:48:53 crc kubenswrapper[4845]: I1006 06:48:53.019852 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 06:48:53 crc kubenswrapper[4845]: I1006 06:48:53.019931 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6"
Oct 06 06:48:53 crc kubenswrapper[4845]: I1006 06:48:53.020963 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca"} pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 06:48:53 crc kubenswrapper[4845]: I1006 06:48:53.021060 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" containerID="cri-o://cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca" gracePeriod=600
Oct 06 06:48:53 crc kubenswrapper[4845]: I1006 06:48:53.795339 4845 generic.go:334] "Generic (PLEG): container finished" podID="6936952c-09f0-48fd-8832-38c18202ae81" containerID="cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca" exitCode=0
Oct 06 06:48:53 crc kubenswrapper[4845]: I1006 06:48:53.795438 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerDied","Data":"cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca"}
Oct 06 06:48:53 crc
kubenswrapper[4845]: I1006 06:48:53.796000 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerStarted","Data":"a91a13a44ef01ed072d381b9c9fc9a7b2fdaefa3297b1423272db8e0908ed6fd"} Oct 06 06:49:00 crc kubenswrapper[4845]: I1006 06:49:00.701104 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" podUID="4a5668eb-22b9-4eca-b0fa-6c53e83da118" containerName="oauth-openshift" containerID="cri-o://e322a1f6cf5a317fb7d282441cdccf3fb214da309bfab95a262d1c49642babfd" gracePeriod=15 Oct 06 06:49:00 crc kubenswrapper[4845]: I1006 06:49:00.840341 4845 generic.go:334] "Generic (PLEG): container finished" podID="4a5668eb-22b9-4eca-b0fa-6c53e83da118" containerID="e322a1f6cf5a317fb7d282441cdccf3fb214da309bfab95a262d1c49642babfd" exitCode=0 Oct 06 06:49:00 crc kubenswrapper[4845]: I1006 06:49:00.840403 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" event={"ID":"4a5668eb-22b9-4eca-b0fa-6c53e83da118","Type":"ContainerDied","Data":"e322a1f6cf5a317fb7d282441cdccf3fb214da309bfab95a262d1c49642babfd"} Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.080664 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.208554 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-cliconfig\") pod \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.208618 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-router-certs\") pod \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.208749 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-template-provider-selection\") pod \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.208801 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-service-ca\") pod \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.208834 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-session\") pod \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\" (UID: 
\"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.208870 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-idp-0-file-data\") pod \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.208901 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-audit-policies\") pod \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.208925 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-template-error\") pod \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.208953 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5qg6\" (UniqueName: \"kubernetes.io/projected/4a5668eb-22b9-4eca-b0fa-6c53e83da118-kube-api-access-m5qg6\") pod \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.208977 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-serving-cert\") pod \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.208999 4845 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-template-login\") pod \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.209028 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a5668eb-22b9-4eca-b0fa-6c53e83da118-audit-dir\") pod \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.209053 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-ocp-branding-template\") pod \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.209073 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-trusted-ca-bundle\") pod \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\" (UID: \"4a5668eb-22b9-4eca-b0fa-6c53e83da118\") " Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.209538 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "4a5668eb-22b9-4eca-b0fa-6c53e83da118" (UID: "4a5668eb-22b9-4eca-b0fa-6c53e83da118"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.209625 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "4a5668eb-22b9-4eca-b0fa-6c53e83da118" (UID: "4a5668eb-22b9-4eca-b0fa-6c53e83da118"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.209776 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "4a5668eb-22b9-4eca-b0fa-6c53e83da118" (UID: "4a5668eb-22b9-4eca-b0fa-6c53e83da118"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.210042 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "4a5668eb-22b9-4eca-b0fa-6c53e83da118" (UID: "4a5668eb-22b9-4eca-b0fa-6c53e83da118"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.210083 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a5668eb-22b9-4eca-b0fa-6c53e83da118-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "4a5668eb-22b9-4eca-b0fa-6c53e83da118" (UID: "4a5668eb-22b9-4eca-b0fa-6c53e83da118"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.215413 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a5668eb-22b9-4eca-b0fa-6c53e83da118-kube-api-access-m5qg6" (OuterVolumeSpecName: "kube-api-access-m5qg6") pod "4a5668eb-22b9-4eca-b0fa-6c53e83da118" (UID: "4a5668eb-22b9-4eca-b0fa-6c53e83da118"). InnerVolumeSpecName "kube-api-access-m5qg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.215962 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "4a5668eb-22b9-4eca-b0fa-6c53e83da118" (UID: "4a5668eb-22b9-4eca-b0fa-6c53e83da118"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.216407 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "4a5668eb-22b9-4eca-b0fa-6c53e83da118" (UID: "4a5668eb-22b9-4eca-b0fa-6c53e83da118"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.216754 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "4a5668eb-22b9-4eca-b0fa-6c53e83da118" (UID: "4a5668eb-22b9-4eca-b0fa-6c53e83da118"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.219075 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "4a5668eb-22b9-4eca-b0fa-6c53e83da118" (UID: "4a5668eb-22b9-4eca-b0fa-6c53e83da118"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.219831 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "4a5668eb-22b9-4eca-b0fa-6c53e83da118" (UID: "4a5668eb-22b9-4eca-b0fa-6c53e83da118"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.220023 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "4a5668eb-22b9-4eca-b0fa-6c53e83da118" (UID: "4a5668eb-22b9-4eca-b0fa-6c53e83da118"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.220944 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "4a5668eb-22b9-4eca-b0fa-6c53e83da118" (UID: "4a5668eb-22b9-4eca-b0fa-6c53e83da118"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.221326 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "4a5668eb-22b9-4eca-b0fa-6c53e83da118" (UID: "4a5668eb-22b9-4eca-b0fa-6c53e83da118"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.310922 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.312156 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.312274 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.312363 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.312530 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.312616 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.312712 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.312804 4845 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.312888 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5qg6\" (UniqueName: \"kubernetes.io/projected/4a5668eb-22b9-4eca-b0fa-6c53e83da118-kube-api-access-m5qg6\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.312968 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.313250 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.313339 4845 reconciler_common.go:293] "Volume detached for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a5668eb-22b9-4eca-b0fa-6c53e83da118-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.313447 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.313533 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a5668eb-22b9-4eca-b0fa-6c53e83da118-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.849705 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" event={"ID":"4a5668eb-22b9-4eca-b0fa-6c53e83da118","Type":"ContainerDied","Data":"dd2752b0ee2981662282121f2f43262a0e1d2f84a339ad7cd0e344738b50dbc0"} Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.849816 4845 scope.go:117] "RemoveContainer" containerID="e322a1f6cf5a317fb7d282441cdccf3fb214da309bfab95a262d1c49642babfd" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.851075 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bhflm" Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.920912 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bhflm"] Oct 06 06:49:01 crc kubenswrapper[4845]: I1006 06:49:01.923309 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bhflm"] Oct 06 06:49:02 crc kubenswrapper[4845]: I1006 06:49:02.236789 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a5668eb-22b9-4eca-b0fa-6c53e83da118" path="/var/lib/kubelet/pods/4a5668eb-22b9-4eca-b0fa-6c53e83da118/volumes" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.070586 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-f58fb8db6-96xbm"] Oct 06 06:49:03 crc kubenswrapper[4845]: E1006 06:49:03.070808 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae038f22-b33b-4909-9d36-1b04c873e809" containerName="registry-server" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.070821 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae038f22-b33b-4909-9d36-1b04c873e809" containerName="registry-server" Oct 06 06:49:03 crc kubenswrapper[4845]: E1006 06:49:03.070834 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="293fea8c-d627-4852-84fb-c98e487df3a0" containerName="extract-content" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.070841 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="293fea8c-d627-4852-84fb-c98e487df3a0" containerName="extract-content" Oct 06 06:49:03 crc kubenswrapper[4845]: E1006 06:49:03.070852 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd62485b-5623-4974-b3dd-0fc57a2d1674" containerName="pruner" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.070860 4845 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fd62485b-5623-4974-b3dd-0fc57a2d1674" containerName="pruner" Oct 06 06:49:03 crc kubenswrapper[4845]: E1006 06:49:03.070873 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a5668eb-22b9-4eca-b0fa-6c53e83da118" containerName="oauth-openshift" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.070879 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a5668eb-22b9-4eca-b0fa-6c53e83da118" containerName="oauth-openshift" Oct 06 06:49:03 crc kubenswrapper[4845]: E1006 06:49:03.070888 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="526273f1-c5bb-49d8-828d-4957e03ee814" containerName="extract-content" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.070893 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="526273f1-c5bb-49d8-828d-4957e03ee814" containerName="extract-content" Oct 06 06:49:03 crc kubenswrapper[4845]: E1006 06:49:03.070899 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="526273f1-c5bb-49d8-828d-4957e03ee814" containerName="registry-server" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.070905 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="526273f1-c5bb-49d8-828d-4957e03ee814" containerName="registry-server" Oct 06 06:49:03 crc kubenswrapper[4845]: E1006 06:49:03.070915 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae038f22-b33b-4909-9d36-1b04c873e809" containerName="extract-utilities" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.070922 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae038f22-b33b-4909-9d36-1b04c873e809" containerName="extract-utilities" Oct 06 06:49:03 crc kubenswrapper[4845]: E1006 06:49:03.070931 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192eb68c-c202-49cc-8fda-31df1ba7c691" containerName="registry-server" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.070937 4845 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="192eb68c-c202-49cc-8fda-31df1ba7c691" containerName="registry-server" Oct 06 06:49:03 crc kubenswrapper[4845]: E1006 06:49:03.070945 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="526273f1-c5bb-49d8-828d-4957e03ee814" containerName="extract-utilities" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.070951 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="526273f1-c5bb-49d8-828d-4957e03ee814" containerName="extract-utilities" Oct 06 06:49:03 crc kubenswrapper[4845]: E1006 06:49:03.070958 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192eb68c-c202-49cc-8fda-31df1ba7c691" containerName="extract-content" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.070964 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="192eb68c-c202-49cc-8fda-31df1ba7c691" containerName="extract-content" Oct 06 06:49:03 crc kubenswrapper[4845]: E1006 06:49:03.070973 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="293fea8c-d627-4852-84fb-c98e487df3a0" containerName="registry-server" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.070980 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="293fea8c-d627-4852-84fb-c98e487df3a0" containerName="registry-server" Oct 06 06:49:03 crc kubenswrapper[4845]: E1006 06:49:03.070988 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae038f22-b33b-4909-9d36-1b04c873e809" containerName="extract-content" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.070994 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae038f22-b33b-4909-9d36-1b04c873e809" containerName="extract-content" Oct 06 06:49:03 crc kubenswrapper[4845]: E1006 06:49:03.071003 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192eb68c-c202-49cc-8fda-31df1ba7c691" containerName="extract-utilities" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.071008 4845 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="192eb68c-c202-49cc-8fda-31df1ba7c691" containerName="extract-utilities" Oct 06 06:49:03 crc kubenswrapper[4845]: E1006 06:49:03.071016 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="293fea8c-d627-4852-84fb-c98e487df3a0" containerName="extract-utilities" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.071022 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="293fea8c-d627-4852-84fb-c98e487df3a0" containerName="extract-utilities" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.071118 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="293fea8c-d627-4852-84fb-c98e487df3a0" containerName="registry-server" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.071127 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="192eb68c-c202-49cc-8fda-31df1ba7c691" containerName="registry-server" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.071140 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd62485b-5623-4974-b3dd-0fc57a2d1674" containerName="pruner" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.071148 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae038f22-b33b-4909-9d36-1b04c873e809" containerName="registry-server" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.071154 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a5668eb-22b9-4eca-b0fa-6c53e83da118" containerName="oauth-openshift" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.071166 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="526273f1-c5bb-49d8-828d-4957e03ee814" containerName="registry-server" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.071574 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.076107 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.076715 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.076877 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.076924 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.076719 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.077195 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.077308 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.077346 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.078433 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.078600 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 06 06:49:03 crc 
kubenswrapper[4845]: I1006 06:49:03.079486 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.087837 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.094231 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-f58fb8db6-96xbm"] Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.094358 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.095920 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.110516 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.237917 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-router-certs\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.237976 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb91f5e6-dea2-4ad2-a572-76180bbf278b-audit-policies\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " 
pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.238002 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.238184 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-user-template-login\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.238250 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-service-ca\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.238280 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s87m9\" (UniqueName: \"kubernetes.io/projected/fb91f5e6-dea2-4ad2-a572-76180bbf278b-kube-api-access-s87m9\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.238323 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-user-template-error\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.238339 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.238396 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.238425 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.238460 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-session\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.238592 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb91f5e6-dea2-4ad2-a572-76180bbf278b-audit-dir\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.238613 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.238633 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.339605 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-router-certs\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " 
pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.339716 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb91f5e6-dea2-4ad2-a572-76180bbf278b-audit-policies\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.339760 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.339799 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-user-template-login\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.339862 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-service-ca\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.339910 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s87m9\" (UniqueName: 
\"kubernetes.io/projected/fb91f5e6-dea2-4ad2-a572-76180bbf278b-kube-api-access-s87m9\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.339950 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-user-template-error\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.339981 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.340024 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.340069 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " 
pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.340112 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-session\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.340149 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb91f5e6-dea2-4ad2-a572-76180bbf278b-audit-dir\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.340183 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.340280 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.341048 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/fb91f5e6-dea2-4ad2-a572-76180bbf278b-audit-policies\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.341074 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-service-ca\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.341118 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb91f5e6-dea2-4ad2-a572-76180bbf278b-audit-dir\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.341223 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.343356 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.345823 
4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-user-template-error\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.346432 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.346727 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-router-certs\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.347173 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.347285 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-session\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: 
\"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.347442 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.347940 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.349261 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb91f5e6-dea2-4ad2-a572-76180bbf278b-v4-0-config-user-template-login\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.357643 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s87m9\" (UniqueName: \"kubernetes.io/projected/fb91f5e6-dea2-4ad2-a572-76180bbf278b-kube-api-access-s87m9\") pod \"oauth-openshift-f58fb8db6-96xbm\" (UID: \"fb91f5e6-dea2-4ad2-a572-76180bbf278b\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.406486 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.674941 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-f58fb8db6-96xbm"] Oct 06 06:49:03 crc kubenswrapper[4845]: I1006 06:49:03.865232 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" event={"ID":"fb91f5e6-dea2-4ad2-a572-76180bbf278b","Type":"ContainerStarted","Data":"18fd0d2c36d26e4619ecd878bb2bfcf3389e2d9508c9ce41ac9c6eb430c87ae2"} Oct 06 06:49:04 crc kubenswrapper[4845]: I1006 06:49:04.870974 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" event={"ID":"fb91f5e6-dea2-4ad2-a572-76180bbf278b","Type":"ContainerStarted","Data":"46c852bb5d239cb2eafac3f3b5721e52777220d307d62dd1522c3300d9656e59"} Oct 06 06:49:04 crc kubenswrapper[4845]: I1006 06:49:04.871423 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:04 crc kubenswrapper[4845]: I1006 06:49:04.879599 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" Oct 06 06:49:04 crc kubenswrapper[4845]: I1006 06:49:04.946471 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-f58fb8db6-96xbm" podStartSLOduration=29.94644884 podStartE2EDuration="29.94644884s" podCreationTimestamp="2025-10-06 06:48:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:49:04.905580221 +0000 UTC m=+229.420321229" watchObservedRunningTime="2025-10-06 06:49:04.94644884 +0000 UTC m=+229.461189838" Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.605868 4845 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hnkmw"] Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.607138 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hnkmw" podUID="3ab198b9-c91e-4d19-a404-88eece82b9d6" containerName="registry-server" containerID="cri-o://63aaf21bd28e1abc25deeab1f4d57fae0a47281fe82feea95d1234295a38bc9c" gracePeriod=30 Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.616100 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n72q5"] Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.616335 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n72q5" podUID="f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2" containerName="registry-server" containerID="cri-o://bd2a5b0c2394145e3b446f8f856603ed029dc3b6bf10976d3ba60b5b28ad92d2" gracePeriod=30 Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.629260 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lljhd"] Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.629480 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" podUID="cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67" containerName="marketplace-operator" containerID="cri-o://de5e7492999bc4fa13c81e395e0870d42ad33c800304a9c7a997d0d06184ac5e" gracePeriod=30 Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.633362 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqcvc"] Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.633685 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lqcvc" 
podUID="1aa5e8be-a350-4d6d-b854-aabf6341b043" containerName="registry-server" containerID="cri-o://baec662ae04680c134d87c2e6796cfd311ff0862a37aea3d9b30272c21fd91f5" gracePeriod=30 Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.636879 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4jm5"] Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.637097 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n4jm5" podUID="847bb591-c8a9-4c1a-bace-005ba77c1644" containerName="registry-server" containerID="cri-o://74bb71199efb820c9f3b3f616caf5f2001eb67f9f178d6a35febbaacdd2a21f4" gracePeriod=30 Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.647472 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zxglv"] Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.648262 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zxglv" Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.650648 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zxglv"] Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.748460 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs64z\" (UniqueName: \"kubernetes.io/projected/b5f2f19d-7dbf-4265-8e4c-96739b00f6e2-kube-api-access-zs64z\") pod \"marketplace-operator-79b997595-zxglv\" (UID: \"b5f2f19d-7dbf-4265-8e4c-96739b00f6e2\") " pod="openshift-marketplace/marketplace-operator-79b997595-zxglv" Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.748814 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b5f2f19d-7dbf-4265-8e4c-96739b00f6e2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zxglv\" (UID: \"b5f2f19d-7dbf-4265-8e4c-96739b00f6e2\") " pod="openshift-marketplace/marketplace-operator-79b997595-zxglv" Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.748864 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b5f2f19d-7dbf-4265-8e4c-96739b00f6e2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zxglv\" (UID: \"b5f2f19d-7dbf-4265-8e4c-96739b00f6e2\") " pod="openshift-marketplace/marketplace-operator-79b997595-zxglv" Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.850462 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs64z\" (UniqueName: \"kubernetes.io/projected/b5f2f19d-7dbf-4265-8e4c-96739b00f6e2-kube-api-access-zs64z\") pod \"marketplace-operator-79b997595-zxglv\" (UID: 
\"b5f2f19d-7dbf-4265-8e4c-96739b00f6e2\") " pod="openshift-marketplace/marketplace-operator-79b997595-zxglv" Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.850510 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b5f2f19d-7dbf-4265-8e4c-96739b00f6e2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zxglv\" (UID: \"b5f2f19d-7dbf-4265-8e4c-96739b00f6e2\") " pod="openshift-marketplace/marketplace-operator-79b997595-zxglv" Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.850542 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b5f2f19d-7dbf-4265-8e4c-96739b00f6e2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zxglv\" (UID: \"b5f2f19d-7dbf-4265-8e4c-96739b00f6e2\") " pod="openshift-marketplace/marketplace-operator-79b997595-zxglv" Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.851598 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b5f2f19d-7dbf-4265-8e4c-96739b00f6e2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zxglv\" (UID: \"b5f2f19d-7dbf-4265-8e4c-96739b00f6e2\") " pod="openshift-marketplace/marketplace-operator-79b997595-zxglv" Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.857113 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b5f2f19d-7dbf-4265-8e4c-96739b00f6e2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zxglv\" (UID: \"b5f2f19d-7dbf-4265-8e4c-96739b00f6e2\") " pod="openshift-marketplace/marketplace-operator-79b997595-zxglv" Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.867970 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zs64z\" (UniqueName: \"kubernetes.io/projected/b5f2f19d-7dbf-4265-8e4c-96739b00f6e2-kube-api-access-zs64z\") pod \"marketplace-operator-79b997595-zxglv\" (UID: \"b5f2f19d-7dbf-4265-8e4c-96739b00f6e2\") " pod="openshift-marketplace/marketplace-operator-79b997595-zxglv" Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.960642 4845 generic.go:334] "Generic (PLEG): container finished" podID="cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67" containerID="de5e7492999bc4fa13c81e395e0870d42ad33c800304a9c7a997d0d06184ac5e" exitCode=0 Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.960732 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" event={"ID":"cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67","Type":"ContainerDied","Data":"de5e7492999bc4fa13c81e395e0870d42ad33c800304a9c7a997d0d06184ac5e"} Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.962847 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zxglv" Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.966368 4845 generic.go:334] "Generic (PLEG): container finished" podID="3ab198b9-c91e-4d19-a404-88eece82b9d6" containerID="63aaf21bd28e1abc25deeab1f4d57fae0a47281fe82feea95d1234295a38bc9c" exitCode=0 Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.966461 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hnkmw" event={"ID":"3ab198b9-c91e-4d19-a404-88eece82b9d6","Type":"ContainerDied","Data":"63aaf21bd28e1abc25deeab1f4d57fae0a47281fe82feea95d1234295a38bc9c"} Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.968318 4845 generic.go:334] "Generic (PLEG): container finished" podID="847bb591-c8a9-4c1a-bace-005ba77c1644" containerID="74bb71199efb820c9f3b3f616caf5f2001eb67f9f178d6a35febbaacdd2a21f4" exitCode=0 Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.968368 4845 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4jm5" event={"ID":"847bb591-c8a9-4c1a-bace-005ba77c1644","Type":"ContainerDied","Data":"74bb71199efb820c9f3b3f616caf5f2001eb67f9f178d6a35febbaacdd2a21f4"} Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.970459 4845 generic.go:334] "Generic (PLEG): container finished" podID="1aa5e8be-a350-4d6d-b854-aabf6341b043" containerID="baec662ae04680c134d87c2e6796cfd311ff0862a37aea3d9b30272c21fd91f5" exitCode=0 Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.970510 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqcvc" event={"ID":"1aa5e8be-a350-4d6d-b854-aabf6341b043","Type":"ContainerDied","Data":"baec662ae04680c134d87c2e6796cfd311ff0862a37aea3d9b30272c21fd91f5"} Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.975973 4845 generic.go:334] "Generic (PLEG): container finished" podID="f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2" containerID="bd2a5b0c2394145e3b446f8f856603ed029dc3b6bf10976d3ba60b5b28ad92d2" exitCode=0 Oct 06 06:49:17 crc kubenswrapper[4845]: I1006 06:49:17.976041 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n72q5" event={"ID":"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2","Type":"ContainerDied","Data":"bd2a5b0c2394145e3b446f8f856603ed029dc3b6bf10976d3ba60b5b28ad92d2"} Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.109758 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hnkmw" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.120002 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n72q5" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.145622 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqcvc" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.157992 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4jm5" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.187456 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.254261 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ab198b9-c91e-4d19-a404-88eece82b9d6-utilities\") pod \"3ab198b9-c91e-4d19-a404-88eece82b9d6\" (UID: \"3ab198b9-c91e-4d19-a404-88eece82b9d6\") " Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.254296 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/847bb591-c8a9-4c1a-bace-005ba77c1644-utilities\") pod \"847bb591-c8a9-4c1a-bace-005ba77c1644\" (UID: \"847bb591-c8a9-4c1a-bace-005ba77c1644\") " Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.254322 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/847bb591-c8a9-4c1a-bace-005ba77c1644-catalog-content\") pod \"847bb591-c8a9-4c1a-bace-005ba77c1644\" (UID: \"847bb591-c8a9-4c1a-bace-005ba77c1644\") " Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.254350 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ab198b9-c91e-4d19-a404-88eece82b9d6-catalog-content\") pod \"3ab198b9-c91e-4d19-a404-88eece82b9d6\" (UID: \"3ab198b9-c91e-4d19-a404-88eece82b9d6\") " Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.254414 4845 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-bbf4j\" (UniqueName: \"kubernetes.io/projected/f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2-kube-api-access-bbf4j\") pod \"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2\" (UID: \"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2\") " Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.254445 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8gt9\" (UniqueName: \"kubernetes.io/projected/3ab198b9-c91e-4d19-a404-88eece82b9d6-kube-api-access-l8gt9\") pod \"3ab198b9-c91e-4d19-a404-88eece82b9d6\" (UID: \"3ab198b9-c91e-4d19-a404-88eece82b9d6\") " Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.255260 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ab198b9-c91e-4d19-a404-88eece82b9d6-utilities" (OuterVolumeSpecName: "utilities") pod "3ab198b9-c91e-4d19-a404-88eece82b9d6" (UID: "3ab198b9-c91e-4d19-a404-88eece82b9d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.255352 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/847bb591-c8a9-4c1a-bace-005ba77c1644-utilities" (OuterVolumeSpecName: "utilities") pod "847bb591-c8a9-4c1a-bace-005ba77c1644" (UID: "847bb591-c8a9-4c1a-bace-005ba77c1644"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.258781 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab198b9-c91e-4d19-a404-88eece82b9d6-kube-api-access-l8gt9" (OuterVolumeSpecName: "kube-api-access-l8gt9") pod "3ab198b9-c91e-4d19-a404-88eece82b9d6" (UID: "3ab198b9-c91e-4d19-a404-88eece82b9d6"). InnerVolumeSpecName "kube-api-access-l8gt9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.258904 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2-kube-api-access-bbf4j" (OuterVolumeSpecName: "kube-api-access-bbf4j") pod "f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2" (UID: "f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2"). InnerVolumeSpecName "kube-api-access-bbf4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.260549 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa5e8be-a350-4d6d-b854-aabf6341b043-utilities\") pod \"1aa5e8be-a350-4d6d-b854-aabf6341b043\" (UID: \"1aa5e8be-a350-4d6d-b854-aabf6341b043\") " Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.260643 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa5e8be-a350-4d6d-b854-aabf6341b043-catalog-content\") pod \"1aa5e8be-a350-4d6d-b854-aabf6341b043\" (UID: \"1aa5e8be-a350-4d6d-b854-aabf6341b043\") " Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.261401 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7948z\" (UniqueName: \"kubernetes.io/projected/847bb591-c8a9-4c1a-bace-005ba77c1644-kube-api-access-7948z\") pod \"847bb591-c8a9-4c1a-bace-005ba77c1644\" (UID: \"847bb591-c8a9-4c1a-bace-005ba77c1644\") " Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.261436 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2-catalog-content\") pod \"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2\" (UID: \"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2\") " Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 
06:49:18.261513 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz7kh\" (UniqueName: \"kubernetes.io/projected/1aa5e8be-a350-4d6d-b854-aabf6341b043-kube-api-access-pz7kh\") pod \"1aa5e8be-a350-4d6d-b854-aabf6341b043\" (UID: \"1aa5e8be-a350-4d6d-b854-aabf6341b043\") " Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.261544 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2-utilities\") pod \"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2\" (UID: \"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2\") " Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.261864 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa5e8be-a350-4d6d-b854-aabf6341b043-utilities" (OuterVolumeSpecName: "utilities") pod "1aa5e8be-a350-4d6d-b854-aabf6341b043" (UID: "1aa5e8be-a350-4d6d-b854-aabf6341b043"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.262089 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ab198b9-c91e-4d19-a404-88eece82b9d6-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.262105 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/847bb591-c8a9-4c1a-bace-005ba77c1644-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.262115 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbf4j\" (UniqueName: \"kubernetes.io/projected/f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2-kube-api-access-bbf4j\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.262127 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8gt9\" (UniqueName: \"kubernetes.io/projected/3ab198b9-c91e-4d19-a404-88eece82b9d6-kube-api-access-l8gt9\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.262136 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa5e8be-a350-4d6d-b854-aabf6341b043-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.263107 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2-utilities" (OuterVolumeSpecName: "utilities") pod "f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2" (UID: "f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.266056 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa5e8be-a350-4d6d-b854-aabf6341b043-kube-api-access-pz7kh" (OuterVolumeSpecName: "kube-api-access-pz7kh") pod "1aa5e8be-a350-4d6d-b854-aabf6341b043" (UID: "1aa5e8be-a350-4d6d-b854-aabf6341b043"). InnerVolumeSpecName "kube-api-access-pz7kh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.272267 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/847bb591-c8a9-4c1a-bace-005ba77c1644-kube-api-access-7948z" (OuterVolumeSpecName: "kube-api-access-7948z") pod "847bb591-c8a9-4c1a-bace-005ba77c1644" (UID: "847bb591-c8a9-4c1a-bace-005ba77c1644"). InnerVolumeSpecName "kube-api-access-7948z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.294759 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa5e8be-a350-4d6d-b854-aabf6341b043-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1aa5e8be-a350-4d6d-b854-aabf6341b043" (UID: "1aa5e8be-a350-4d6d-b854-aabf6341b043"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.328357 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2" (UID: "f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.334964 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ab198b9-c91e-4d19-a404-88eece82b9d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ab198b9-c91e-4d19-a404-88eece82b9d6" (UID: "3ab198b9-c91e-4d19-a404-88eece82b9d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.363160 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7phnq\" (UniqueName: \"kubernetes.io/projected/cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67-kube-api-access-7phnq\") pod \"cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67\" (UID: \"cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67\") " Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.363263 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67-marketplace-trusted-ca\") pod \"cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67\" (UID: \"cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67\") " Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.363301 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67-marketplace-operator-metrics\") pod \"cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67\" (UID: \"cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67\") " Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.363524 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7948z\" (UniqueName: \"kubernetes.io/projected/847bb591-c8a9-4c1a-bace-005ba77c1644-kube-api-access-7948z\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.363552 4845 reconciler_common.go:293] 
"Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.363564 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz7kh\" (UniqueName: \"kubernetes.io/projected/1aa5e8be-a350-4d6d-b854-aabf6341b043-kube-api-access-pz7kh\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.363574 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.363585 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ab198b9-c91e-4d19-a404-88eece82b9d6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.363594 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa5e8be-a350-4d6d-b854-aabf6341b043-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.363975 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67" (UID: "cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.367120 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67" (UID: "cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.369389 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67-kube-api-access-7phnq" (OuterVolumeSpecName: "kube-api-access-7phnq") pod "cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67" (UID: "cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67"). InnerVolumeSpecName "kube-api-access-7phnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.376923 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/847bb591-c8a9-4c1a-bace-005ba77c1644-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "847bb591-c8a9-4c1a-bace-005ba77c1644" (UID: "847bb591-c8a9-4c1a-bace-005ba77c1644"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.464454 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/847bb591-c8a9-4c1a-bace-005ba77c1644-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.464485 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7phnq\" (UniqueName: \"kubernetes.io/projected/cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67-kube-api-access-7phnq\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.464496 4845 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.464505 4845 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.483823 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zxglv"] Oct 06 06:49:18 crc kubenswrapper[4845]: W1006 06:49:18.507971 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5f2f19d_7dbf_4265_8e4c_96739b00f6e2.slice/crio-b6f12348c4fa83b82e4a9378a2c0c85e16a7d3dc0531d50e2482f30b18f24de4 WatchSource:0}: Error finding container b6f12348c4fa83b82e4a9378a2c0c85e16a7d3dc0531d50e2482f30b18f24de4: Status 404 returned error can't find the container with id b6f12348c4fa83b82e4a9378a2c0c85e16a7d3dc0531d50e2482f30b18f24de4 Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.982617 4845 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4jm5" event={"ID":"847bb591-c8a9-4c1a-bace-005ba77c1644","Type":"ContainerDied","Data":"c48c296b50e52b0da386fa9cd2c0db10106eef37c1e3a5edcaa8783750f0d522"} Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.982875 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4jm5" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.982901 4845 scope.go:117] "RemoveContainer" containerID="74bb71199efb820c9f3b3f616caf5f2001eb67f9f178d6a35febbaacdd2a21f4" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.986054 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqcvc" event={"ID":"1aa5e8be-a350-4d6d-b854-aabf6341b043","Type":"ContainerDied","Data":"9b373833f32de701c991abf4498448696658377c4c3e6373bfc45279ccadd614"} Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.986136 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqcvc" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.987303 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zxglv" event={"ID":"b5f2f19d-7dbf-4265-8e4c-96739b00f6e2","Type":"ContainerStarted","Data":"472e2488d8573195f2ec2859f08a47940fe71d490a00f1052ad5351089773cf1"} Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.987347 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zxglv" event={"ID":"b5f2f19d-7dbf-4265-8e4c-96739b00f6e2","Type":"ContainerStarted","Data":"b6f12348c4fa83b82e4a9378a2c0c85e16a7d3dc0531d50e2482f30b18f24de4"} Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.987534 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zxglv" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.988429 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" event={"ID":"cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67","Type":"ContainerDied","Data":"94c39808c94f08ee0c79cb02694ec716dc89a1bc043addc0c7c7f120a0cf7d4f"} Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.988497 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lljhd" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.990813 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n72q5" event={"ID":"f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2","Type":"ContainerDied","Data":"442d18c70ec3cf7c8a44dce1c229b7657abfdadca58f03b0c99a57c90d57a0e5"} Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.990936 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n72q5" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.993349 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hnkmw" event={"ID":"3ab198b9-c91e-4d19-a404-88eece82b9d6","Type":"ContainerDied","Data":"55141e59c522d29737414e4d68a8bba76e1f92fb3d2d85fe19320b1ddbce2f53"} Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.993432 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hnkmw" Oct 06 06:49:18 crc kubenswrapper[4845]: I1006 06:49:18.997248 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zxglv" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.012844 4845 scope.go:117] "RemoveContainer" containerID="d51323bbd0c00d4a9a5962704d014509c8f2e608e25340f3fc7323e32ef98f26" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.023132 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zxglv" podStartSLOduration=2.02310976 podStartE2EDuration="2.02310976s" podCreationTimestamp="2025-10-06 06:49:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:49:19.011468693 +0000 UTC m=+243.526209711" watchObservedRunningTime="2025-10-06 06:49:19.02310976 +0000 UTC m=+243.537850788" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.050518 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4jm5"] Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.053337 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n4jm5"] Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.061560 4845 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqcvc"] Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.061902 4845 scope.go:117] "RemoveContainer" containerID="c976208d51a68962962aeaaaaff63ffe5249cbb0b745bc74e2e6d28dc1ccabdc" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.071698 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqcvc"] Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.074406 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lljhd"] Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.077283 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lljhd"] Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.087536 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hnkmw"] Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.090464 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hnkmw"] Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.091715 4845 scope.go:117] "RemoveContainer" containerID="baec662ae04680c134d87c2e6796cfd311ff0862a37aea3d9b30272c21fd91f5" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.099022 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n72q5"] Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.104824 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n72q5"] Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.111927 4845 scope.go:117] "RemoveContainer" containerID="8a530401f63b97b62b4792a0200eef60e760e7149d699f56086eb442d4ff6c8c" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.124242 4845 scope.go:117] "RemoveContainer" 
containerID="16207404271c86431b80c1f517800b7aad5775839295c4cec79f863ca27e055e" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.138280 4845 scope.go:117] "RemoveContainer" containerID="de5e7492999bc4fa13c81e395e0870d42ad33c800304a9c7a997d0d06184ac5e" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.154308 4845 scope.go:117] "RemoveContainer" containerID="bd2a5b0c2394145e3b446f8f856603ed029dc3b6bf10976d3ba60b5b28ad92d2" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.167070 4845 scope.go:117] "RemoveContainer" containerID="a9b252f00c73dae906120440f679a11ba24ccd299f134e14264aa1dfd83174af" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.182550 4845 scope.go:117] "RemoveContainer" containerID="737e2cd6623faf22a6a6fa0a553f5d4b553855a3e8c2771c99a5dc35f43f98af" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.194274 4845 scope.go:117] "RemoveContainer" containerID="63aaf21bd28e1abc25deeab1f4d57fae0a47281fe82feea95d1234295a38bc9c" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.225026 4845 scope.go:117] "RemoveContainer" containerID="4d05d8e1893d08ce6c0b3960931b1fb29cf79f404918f7295f8e3636ab985efe" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.258058 4845 scope.go:117] "RemoveContainer" containerID="8effcacd8ab1f24fda40706bc47cd194cd72fc88d9f04111e2a227d56abc212e" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.818999 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ctv9g"] Oct 06 06:49:19 crc kubenswrapper[4845]: E1006 06:49:19.819241 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa5e8be-a350-4d6d-b854-aabf6341b043" containerName="registry-server" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.819252 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa5e8be-a350-4d6d-b854-aabf6341b043" containerName="registry-server" Oct 06 06:49:19 crc kubenswrapper[4845]: E1006 06:49:19.819262 4845 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2" containerName="extract-utilities" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.819268 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2" containerName="extract-utilities" Oct 06 06:49:19 crc kubenswrapper[4845]: E1006 06:49:19.819279 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa5e8be-a350-4d6d-b854-aabf6341b043" containerName="extract-utilities" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.819285 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa5e8be-a350-4d6d-b854-aabf6341b043" containerName="extract-utilities" Oct 06 06:49:19 crc kubenswrapper[4845]: E1006 06:49:19.819295 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa5e8be-a350-4d6d-b854-aabf6341b043" containerName="extract-content" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.819301 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa5e8be-a350-4d6d-b854-aabf6341b043" containerName="extract-content" Oct 06 06:49:19 crc kubenswrapper[4845]: E1006 06:49:19.819310 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="847bb591-c8a9-4c1a-bace-005ba77c1644" containerName="extract-utilities" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.819316 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="847bb591-c8a9-4c1a-bace-005ba77c1644" containerName="extract-utilities" Oct 06 06:49:19 crc kubenswrapper[4845]: E1006 06:49:19.819322 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ab198b9-c91e-4d19-a404-88eece82b9d6" containerName="registry-server" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.819328 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ab198b9-c91e-4d19-a404-88eece82b9d6" containerName="registry-server" Oct 06 06:49:19 crc kubenswrapper[4845]: E1006 06:49:19.819338 4845 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3ab198b9-c91e-4d19-a404-88eece82b9d6" containerName="extract-utilities" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.819345 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ab198b9-c91e-4d19-a404-88eece82b9d6" containerName="extract-utilities" Oct 06 06:49:19 crc kubenswrapper[4845]: E1006 06:49:19.819352 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2" containerName="registry-server" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.819358 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2" containerName="registry-server" Oct 06 06:49:19 crc kubenswrapper[4845]: E1006 06:49:19.819367 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67" containerName="marketplace-operator" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.819386 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67" containerName="marketplace-operator" Oct 06 06:49:19 crc kubenswrapper[4845]: E1006 06:49:19.819395 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="847bb591-c8a9-4c1a-bace-005ba77c1644" containerName="registry-server" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.819402 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="847bb591-c8a9-4c1a-bace-005ba77c1644" containerName="registry-server" Oct 06 06:49:19 crc kubenswrapper[4845]: E1006 06:49:19.819416 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="847bb591-c8a9-4c1a-bace-005ba77c1644" containerName="extract-content" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.819422 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="847bb591-c8a9-4c1a-bace-005ba77c1644" containerName="extract-content" Oct 06 06:49:19 crc kubenswrapper[4845]: E1006 06:49:19.819430 4845 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3ab198b9-c91e-4d19-a404-88eece82b9d6" containerName="extract-content" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.819436 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ab198b9-c91e-4d19-a404-88eece82b9d6" containerName="extract-content" Oct 06 06:49:19 crc kubenswrapper[4845]: E1006 06:49:19.819445 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2" containerName="extract-content" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.819451 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2" containerName="extract-content" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.819536 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ab198b9-c91e-4d19-a404-88eece82b9d6" containerName="registry-server" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.819549 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa5e8be-a350-4d6d-b854-aabf6341b043" containerName="registry-server" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.819560 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2" containerName="registry-server" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.819568 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67" containerName="marketplace-operator" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.819580 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="847bb591-c8a9-4c1a-bace-005ba77c1644" containerName="registry-server" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.820349 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ctv9g" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.823878 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ctv9g"] Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.824254 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.981554 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c769aa66-5169-4ec2-8993-540bd8bfcfca-catalog-content\") pod \"redhat-marketplace-ctv9g\" (UID: \"c769aa66-5169-4ec2-8993-540bd8bfcfca\") " pod="openshift-marketplace/redhat-marketplace-ctv9g" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.981956 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxlsg\" (UniqueName: \"kubernetes.io/projected/c769aa66-5169-4ec2-8993-540bd8bfcfca-kube-api-access-dxlsg\") pod \"redhat-marketplace-ctv9g\" (UID: \"c769aa66-5169-4ec2-8993-540bd8bfcfca\") " pod="openshift-marketplace/redhat-marketplace-ctv9g" Oct 06 06:49:19 crc kubenswrapper[4845]: I1006 06:49:19.981998 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c769aa66-5169-4ec2-8993-540bd8bfcfca-utilities\") pod \"redhat-marketplace-ctv9g\" (UID: \"c769aa66-5169-4ec2-8993-540bd8bfcfca\") " pod="openshift-marketplace/redhat-marketplace-ctv9g" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.017001 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-msb4z"] Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.019006 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-msb4z" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.022230 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-msb4z"] Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.022700 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.083075 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c769aa66-5169-4ec2-8993-540bd8bfcfca-catalog-content\") pod \"redhat-marketplace-ctv9g\" (UID: \"c769aa66-5169-4ec2-8993-540bd8bfcfca\") " pod="openshift-marketplace/redhat-marketplace-ctv9g" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.083175 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxlsg\" (UniqueName: \"kubernetes.io/projected/c769aa66-5169-4ec2-8993-540bd8bfcfca-kube-api-access-dxlsg\") pod \"redhat-marketplace-ctv9g\" (UID: \"c769aa66-5169-4ec2-8993-540bd8bfcfca\") " pod="openshift-marketplace/redhat-marketplace-ctv9g" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.083214 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c769aa66-5169-4ec2-8993-540bd8bfcfca-utilities\") pod \"redhat-marketplace-ctv9g\" (UID: \"c769aa66-5169-4ec2-8993-540bd8bfcfca\") " pod="openshift-marketplace/redhat-marketplace-ctv9g" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.084343 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c769aa66-5169-4ec2-8993-540bd8bfcfca-catalog-content\") pod \"redhat-marketplace-ctv9g\" (UID: \"c769aa66-5169-4ec2-8993-540bd8bfcfca\") " 
pod="openshift-marketplace/redhat-marketplace-ctv9g" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.084866 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c769aa66-5169-4ec2-8993-540bd8bfcfca-utilities\") pod \"redhat-marketplace-ctv9g\" (UID: \"c769aa66-5169-4ec2-8993-540bd8bfcfca\") " pod="openshift-marketplace/redhat-marketplace-ctv9g" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.104066 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxlsg\" (UniqueName: \"kubernetes.io/projected/c769aa66-5169-4ec2-8993-540bd8bfcfca-kube-api-access-dxlsg\") pod \"redhat-marketplace-ctv9g\" (UID: \"c769aa66-5169-4ec2-8993-540bd8bfcfca\") " pod="openshift-marketplace/redhat-marketplace-ctv9g" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.142004 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ctv9g" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.184965 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738fc958-3a60-4780-aa02-8af7f6887aa6-utilities\") pod \"redhat-operators-msb4z\" (UID: \"738fc958-3a60-4780-aa02-8af7f6887aa6\") " pod="openshift-marketplace/redhat-operators-msb4z" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.185018 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l52g\" (UniqueName: \"kubernetes.io/projected/738fc958-3a60-4780-aa02-8af7f6887aa6-kube-api-access-8l52g\") pod \"redhat-operators-msb4z\" (UID: \"738fc958-3a60-4780-aa02-8af7f6887aa6\") " pod="openshift-marketplace/redhat-operators-msb4z" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.185114 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738fc958-3a60-4780-aa02-8af7f6887aa6-catalog-content\") pod \"redhat-operators-msb4z\" (UID: \"738fc958-3a60-4780-aa02-8af7f6887aa6\") " pod="openshift-marketplace/redhat-operators-msb4z" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.236958 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa5e8be-a350-4d6d-b854-aabf6341b043" path="/var/lib/kubelet/pods/1aa5e8be-a350-4d6d-b854-aabf6341b043/volumes" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.237985 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab198b9-c91e-4d19-a404-88eece82b9d6" path="/var/lib/kubelet/pods/3ab198b9-c91e-4d19-a404-88eece82b9d6/volumes" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.238691 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="847bb591-c8a9-4c1a-bace-005ba77c1644" path="/var/lib/kubelet/pods/847bb591-c8a9-4c1a-bace-005ba77c1644/volumes" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.241116 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67" path="/var/lib/kubelet/pods/cbf6442d-181f-4c2d-b5b5-22b1b2d6cd67/volumes" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.241593 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2" path="/var/lib/kubelet/pods/f2eeb1a7-98d0-4b52-9c07-f9ba90f486e2/volumes" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.286081 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738fc958-3a60-4780-aa02-8af7f6887aa6-utilities\") pod \"redhat-operators-msb4z\" (UID: \"738fc958-3a60-4780-aa02-8af7f6887aa6\") " pod="openshift-marketplace/redhat-operators-msb4z" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.286521 4845 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738fc958-3a60-4780-aa02-8af7f6887aa6-utilities\") pod \"redhat-operators-msb4z\" (UID: \"738fc958-3a60-4780-aa02-8af7f6887aa6\") " pod="openshift-marketplace/redhat-operators-msb4z" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.286565 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l52g\" (UniqueName: \"kubernetes.io/projected/738fc958-3a60-4780-aa02-8af7f6887aa6-kube-api-access-8l52g\") pod \"redhat-operators-msb4z\" (UID: \"738fc958-3a60-4780-aa02-8af7f6887aa6\") " pod="openshift-marketplace/redhat-operators-msb4z" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.286597 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738fc958-3a60-4780-aa02-8af7f6887aa6-catalog-content\") pod \"redhat-operators-msb4z\" (UID: \"738fc958-3a60-4780-aa02-8af7f6887aa6\") " pod="openshift-marketplace/redhat-operators-msb4z" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.287442 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738fc958-3a60-4780-aa02-8af7f6887aa6-catalog-content\") pod \"redhat-operators-msb4z\" (UID: \"738fc958-3a60-4780-aa02-8af7f6887aa6\") " pod="openshift-marketplace/redhat-operators-msb4z" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.311779 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l52g\" (UniqueName: \"kubernetes.io/projected/738fc958-3a60-4780-aa02-8af7f6887aa6-kube-api-access-8l52g\") pod \"redhat-operators-msb4z\" (UID: \"738fc958-3a60-4780-aa02-8af7f6887aa6\") " pod="openshift-marketplace/redhat-operators-msb4z" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.335092 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-msb4z" Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.505243 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-msb4z"] Oct 06 06:49:20 crc kubenswrapper[4845]: I1006 06:49:20.540395 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ctv9g"] Oct 06 06:49:20 crc kubenswrapper[4845]: W1006 06:49:20.545199 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc769aa66_5169_4ec2_8993_540bd8bfcfca.slice/crio-84bad291b6ffdb7cabc71a43aa694fa62dc5c8e4982e7364c9f0d4c536618cd0 WatchSource:0}: Error finding container 84bad291b6ffdb7cabc71a43aa694fa62dc5c8e4982e7364c9f0d4c536618cd0: Status 404 returned error can't find the container with id 84bad291b6ffdb7cabc71a43aa694fa62dc5c8e4982e7364c9f0d4c536618cd0 Oct 06 06:49:21 crc kubenswrapper[4845]: I1006 06:49:21.011769 4845 generic.go:334] "Generic (PLEG): container finished" podID="738fc958-3a60-4780-aa02-8af7f6887aa6" containerID="648960f9ba7bb94be46feb14373218aa5482ecdfa07e7f65845dbf76d6f928e5" exitCode=0 Oct 06 06:49:21 crc kubenswrapper[4845]: I1006 06:49:21.011842 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-msb4z" event={"ID":"738fc958-3a60-4780-aa02-8af7f6887aa6","Type":"ContainerDied","Data":"648960f9ba7bb94be46feb14373218aa5482ecdfa07e7f65845dbf76d6f928e5"} Oct 06 06:49:21 crc kubenswrapper[4845]: I1006 06:49:21.011870 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-msb4z" event={"ID":"738fc958-3a60-4780-aa02-8af7f6887aa6","Type":"ContainerStarted","Data":"b53bf6716aacb4ee690ca6c337348499c0e2473941db7c535bcaf1954ce8babb"} Oct 06 06:49:21 crc kubenswrapper[4845]: I1006 06:49:21.013304 4845 generic.go:334] "Generic (PLEG): container finished" 
podID="c769aa66-5169-4ec2-8993-540bd8bfcfca" containerID="dfd9fe1d59a8a3d95b768dfd787f5f8660906c8118a6c4796aeb8d6011945381" exitCode=0 Oct 06 06:49:21 crc kubenswrapper[4845]: I1006 06:49:21.013363 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctv9g" event={"ID":"c769aa66-5169-4ec2-8993-540bd8bfcfca","Type":"ContainerDied","Data":"dfd9fe1d59a8a3d95b768dfd787f5f8660906c8118a6c4796aeb8d6011945381"} Oct 06 06:49:21 crc kubenswrapper[4845]: I1006 06:49:21.013446 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctv9g" event={"ID":"c769aa66-5169-4ec2-8993-540bd8bfcfca","Type":"ContainerStarted","Data":"84bad291b6ffdb7cabc71a43aa694fa62dc5c8e4982e7364c9f0d4c536618cd0"} Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.039900 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-msb4z" event={"ID":"738fc958-3a60-4780-aa02-8af7f6887aa6","Type":"ContainerStarted","Data":"b81618cf952c440c1db9a4414b0a0e90afd0e51655fb0677841ecd475784dda3"} Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.214543 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9qgph"] Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.219053 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9qgph" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.221763 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.221963 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9qgph"] Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.308884 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pttfh\" (UniqueName: \"kubernetes.io/projected/ec5e118e-758c-4120-a070-fe923261cadc-kube-api-access-pttfh\") pod \"community-operators-9qgph\" (UID: \"ec5e118e-758c-4120-a070-fe923261cadc\") " pod="openshift-marketplace/community-operators-9qgph" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.309051 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec5e118e-758c-4120-a070-fe923261cadc-utilities\") pod \"community-operators-9qgph\" (UID: \"ec5e118e-758c-4120-a070-fe923261cadc\") " pod="openshift-marketplace/community-operators-9qgph" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.309225 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec5e118e-758c-4120-a070-fe923261cadc-catalog-content\") pod \"community-operators-9qgph\" (UID: \"ec5e118e-758c-4120-a070-fe923261cadc\") " pod="openshift-marketplace/community-operators-9qgph" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.410923 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec5e118e-758c-4120-a070-fe923261cadc-utilities\") pod \"community-operators-9qgph\" (UID: 
\"ec5e118e-758c-4120-a070-fe923261cadc\") " pod="openshift-marketplace/community-operators-9qgph" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.411002 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec5e118e-758c-4120-a070-fe923261cadc-catalog-content\") pod \"community-operators-9qgph\" (UID: \"ec5e118e-758c-4120-a070-fe923261cadc\") " pod="openshift-marketplace/community-operators-9qgph" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.411032 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pttfh\" (UniqueName: \"kubernetes.io/projected/ec5e118e-758c-4120-a070-fe923261cadc-kube-api-access-pttfh\") pod \"community-operators-9qgph\" (UID: \"ec5e118e-758c-4120-a070-fe923261cadc\") " pod="openshift-marketplace/community-operators-9qgph" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.411966 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec5e118e-758c-4120-a070-fe923261cadc-utilities\") pod \"community-operators-9qgph\" (UID: \"ec5e118e-758c-4120-a070-fe923261cadc\") " pod="openshift-marketplace/community-operators-9qgph" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.412367 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec5e118e-758c-4120-a070-fe923261cadc-catalog-content\") pod \"community-operators-9qgph\" (UID: \"ec5e118e-758c-4120-a070-fe923261cadc\") " pod="openshift-marketplace/community-operators-9qgph" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.415078 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m7kmk"] Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.416345 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m7kmk" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.418105 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.430900 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m7kmk"] Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.434813 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pttfh\" (UniqueName: \"kubernetes.io/projected/ec5e118e-758c-4120-a070-fe923261cadc-kube-api-access-pttfh\") pod \"community-operators-9qgph\" (UID: \"ec5e118e-758c-4120-a070-fe923261cadc\") " pod="openshift-marketplace/community-operators-9qgph" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.512044 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccce060c-c044-45d8-8d3c-92cc9e40198a-catalog-content\") pod \"certified-operators-m7kmk\" (UID: \"ccce060c-c044-45d8-8d3c-92cc9e40198a\") " pod="openshift-marketplace/certified-operators-m7kmk" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.512175 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccce060c-c044-45d8-8d3c-92cc9e40198a-utilities\") pod \"certified-operators-m7kmk\" (UID: \"ccce060c-c044-45d8-8d3c-92cc9e40198a\") " pod="openshift-marketplace/certified-operators-m7kmk" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.512256 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p5vh\" (UniqueName: \"kubernetes.io/projected/ccce060c-c044-45d8-8d3c-92cc9e40198a-kube-api-access-9p5vh\") pod \"certified-operators-m7kmk\" (UID: 
\"ccce060c-c044-45d8-8d3c-92cc9e40198a\") " pod="openshift-marketplace/certified-operators-m7kmk" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.563135 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9qgph" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.613652 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccce060c-c044-45d8-8d3c-92cc9e40198a-utilities\") pod \"certified-operators-m7kmk\" (UID: \"ccce060c-c044-45d8-8d3c-92cc9e40198a\") " pod="openshift-marketplace/certified-operators-m7kmk" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.613851 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p5vh\" (UniqueName: \"kubernetes.io/projected/ccce060c-c044-45d8-8d3c-92cc9e40198a-kube-api-access-9p5vh\") pod \"certified-operators-m7kmk\" (UID: \"ccce060c-c044-45d8-8d3c-92cc9e40198a\") " pod="openshift-marketplace/certified-operators-m7kmk" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.614047 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccce060c-c044-45d8-8d3c-92cc9e40198a-catalog-content\") pod \"certified-operators-m7kmk\" (UID: \"ccce060c-c044-45d8-8d3c-92cc9e40198a\") " pod="openshift-marketplace/certified-operators-m7kmk" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.614566 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccce060c-c044-45d8-8d3c-92cc9e40198a-utilities\") pod \"certified-operators-m7kmk\" (UID: \"ccce060c-c044-45d8-8d3c-92cc9e40198a\") " pod="openshift-marketplace/certified-operators-m7kmk" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.614722 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccce060c-c044-45d8-8d3c-92cc9e40198a-catalog-content\") pod \"certified-operators-m7kmk\" (UID: \"ccce060c-c044-45d8-8d3c-92cc9e40198a\") " pod="openshift-marketplace/certified-operators-m7kmk" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.634648 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p5vh\" (UniqueName: \"kubernetes.io/projected/ccce060c-c044-45d8-8d3c-92cc9e40198a-kube-api-access-9p5vh\") pod \"certified-operators-m7kmk\" (UID: \"ccce060c-c044-45d8-8d3c-92cc9e40198a\") " pod="openshift-marketplace/certified-operators-m7kmk" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.769397 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m7kmk" Oct 06 06:49:22 crc kubenswrapper[4845]: I1006 06:49:22.943716 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9qgph"] Oct 06 06:49:22 crc kubenswrapper[4845]: W1006 06:49:22.950560 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec5e118e_758c_4120_a070_fe923261cadc.slice/crio-795ae49680d4571095b0549902e2da7bbb4ae68ddbd0e91da7f2c54f4b98812f WatchSource:0}: Error finding container 795ae49680d4571095b0549902e2da7bbb4ae68ddbd0e91da7f2c54f4b98812f: Status 404 returned error can't find the container with id 795ae49680d4571095b0549902e2da7bbb4ae68ddbd0e91da7f2c54f4b98812f Oct 06 06:49:23 crc kubenswrapper[4845]: I1006 06:49:23.047655 4845 generic.go:334] "Generic (PLEG): container finished" podID="c769aa66-5169-4ec2-8993-540bd8bfcfca" containerID="ed05377845e01606c0d7c65e1f271f0aae65f8540976c82dbe1426791553c832" exitCode=0 Oct 06 06:49:23 crc kubenswrapper[4845]: I1006 06:49:23.047760 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctv9g" 
event={"ID":"c769aa66-5169-4ec2-8993-540bd8bfcfca","Type":"ContainerDied","Data":"ed05377845e01606c0d7c65e1f271f0aae65f8540976c82dbe1426791553c832"} Oct 06 06:49:23 crc kubenswrapper[4845]: I1006 06:49:23.048813 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9qgph" event={"ID":"ec5e118e-758c-4120-a070-fe923261cadc","Type":"ContainerStarted","Data":"795ae49680d4571095b0549902e2da7bbb4ae68ddbd0e91da7f2c54f4b98812f"} Oct 06 06:49:23 crc kubenswrapper[4845]: I1006 06:49:23.051855 4845 generic.go:334] "Generic (PLEG): container finished" podID="738fc958-3a60-4780-aa02-8af7f6887aa6" containerID="b81618cf952c440c1db9a4414b0a0e90afd0e51655fb0677841ecd475784dda3" exitCode=0 Oct 06 06:49:23 crc kubenswrapper[4845]: I1006 06:49:23.051891 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-msb4z" event={"ID":"738fc958-3a60-4780-aa02-8af7f6887aa6","Type":"ContainerDied","Data":"b81618cf952c440c1db9a4414b0a0e90afd0e51655fb0677841ecd475784dda3"} Oct 06 06:49:23 crc kubenswrapper[4845]: I1006 06:49:23.180947 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m7kmk"] Oct 06 06:49:23 crc kubenswrapper[4845]: W1006 06:49:23.187295 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccce060c_c044_45d8_8d3c_92cc9e40198a.slice/crio-536ecc78a1416c4450dea3d94e0066b976bb54996aba62dd0186796069aa5612 WatchSource:0}: Error finding container 536ecc78a1416c4450dea3d94e0066b976bb54996aba62dd0186796069aa5612: Status 404 returned error can't find the container with id 536ecc78a1416c4450dea3d94e0066b976bb54996aba62dd0186796069aa5612 Oct 06 06:49:24 crc kubenswrapper[4845]: I1006 06:49:24.060686 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctv9g" 
event={"ID":"c769aa66-5169-4ec2-8993-540bd8bfcfca","Type":"ContainerStarted","Data":"ae938c5aea297c138cf0ada04cf46758965bae1c8a82a0ec5e412a1aa8a37f3b"} Oct 06 06:49:24 crc kubenswrapper[4845]: I1006 06:49:24.061693 4845 generic.go:334] "Generic (PLEG): container finished" podID="ec5e118e-758c-4120-a070-fe923261cadc" containerID="90f935dabbe4dcc535f3c4219f26b1be7ba31fdf7511a2ee3303647dd8912274" exitCode=0 Oct 06 06:49:24 crc kubenswrapper[4845]: I1006 06:49:24.061747 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9qgph" event={"ID":"ec5e118e-758c-4120-a070-fe923261cadc","Type":"ContainerDied","Data":"90f935dabbe4dcc535f3c4219f26b1be7ba31fdf7511a2ee3303647dd8912274"} Oct 06 06:49:24 crc kubenswrapper[4845]: I1006 06:49:24.066028 4845 generic.go:334] "Generic (PLEG): container finished" podID="ccce060c-c044-45d8-8d3c-92cc9e40198a" containerID="c00ba88a7e1afa7d90eabe4b1aefdda2dfa97d478bc84992e1b36da14e51c290" exitCode=0 Oct 06 06:49:24 crc kubenswrapper[4845]: I1006 06:49:24.066101 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7kmk" event={"ID":"ccce060c-c044-45d8-8d3c-92cc9e40198a","Type":"ContainerDied","Data":"c00ba88a7e1afa7d90eabe4b1aefdda2dfa97d478bc84992e1b36da14e51c290"} Oct 06 06:49:24 crc kubenswrapper[4845]: I1006 06:49:24.066129 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7kmk" event={"ID":"ccce060c-c044-45d8-8d3c-92cc9e40198a","Type":"ContainerStarted","Data":"536ecc78a1416c4450dea3d94e0066b976bb54996aba62dd0186796069aa5612"} Oct 06 06:49:24 crc kubenswrapper[4845]: I1006 06:49:24.071356 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-msb4z" event={"ID":"738fc958-3a60-4780-aa02-8af7f6887aa6","Type":"ContainerStarted","Data":"051e323fbbddd13fe0b2fd6831b6190bad2691f849ffce10fabcfd74f39af57f"} Oct 06 06:49:24 crc kubenswrapper[4845]: I1006 
06:49:24.085956 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ctv9g" podStartSLOduration=2.6503211970000002 podStartE2EDuration="5.085939729s" podCreationTimestamp="2025-10-06 06:49:19 +0000 UTC" firstStartedPulling="2025-10-06 06:49:21.016499729 +0000 UTC m=+245.531240727" lastFinishedPulling="2025-10-06 06:49:23.452118251 +0000 UTC m=+247.966859259" observedRunningTime="2025-10-06 06:49:24.085382893 +0000 UTC m=+248.600123901" watchObservedRunningTime="2025-10-06 06:49:24.085939729 +0000 UTC m=+248.600680737" Oct 06 06:49:24 crc kubenswrapper[4845]: I1006 06:49:24.110440 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-msb4z" podStartSLOduration=1.602761471 podStartE2EDuration="4.110421637s" podCreationTimestamp="2025-10-06 06:49:20 +0000 UTC" firstStartedPulling="2025-10-06 06:49:21.013527556 +0000 UTC m=+245.528268564" lastFinishedPulling="2025-10-06 06:49:23.521187722 +0000 UTC m=+248.035928730" observedRunningTime="2025-10-06 06:49:24.109007037 +0000 UTC m=+248.623748045" watchObservedRunningTime="2025-10-06 06:49:24.110421637 +0000 UTC m=+248.625162655" Oct 06 06:49:25 crc kubenswrapper[4845]: I1006 06:49:25.077226 4845 generic.go:334] "Generic (PLEG): container finished" podID="ec5e118e-758c-4120-a070-fe923261cadc" containerID="8b646d39621b8712151d3d1c39cb2f9315c04c850eef467649eef2a8aa2bf1be" exitCode=0 Oct 06 06:49:25 crc kubenswrapper[4845]: I1006 06:49:25.077294 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9qgph" event={"ID":"ec5e118e-758c-4120-a070-fe923261cadc","Type":"ContainerDied","Data":"8b646d39621b8712151d3d1c39cb2f9315c04c850eef467649eef2a8aa2bf1be"} Oct 06 06:49:25 crc kubenswrapper[4845]: I1006 06:49:25.079169 4845 generic.go:334] "Generic (PLEG): container finished" podID="ccce060c-c044-45d8-8d3c-92cc9e40198a" 
containerID="8a8eb88c9edd399a01ab14bb7aa66cdbde5345c30ba3a088ffb98d4f9f001937" exitCode=0
Oct 06 06:49:25 crc kubenswrapper[4845]: I1006 06:49:25.079203 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7kmk" event={"ID":"ccce060c-c044-45d8-8d3c-92cc9e40198a","Type":"ContainerDied","Data":"8a8eb88c9edd399a01ab14bb7aa66cdbde5345c30ba3a088ffb98d4f9f001937"}
Oct 06 06:49:27 crc kubenswrapper[4845]: I1006 06:49:27.090459 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9qgph" event={"ID":"ec5e118e-758c-4120-a070-fe923261cadc","Type":"ContainerStarted","Data":"6a8edcc8aa5f8a56678964d9d9a79e869ee7e4f64d55084a48d6e827591c1111"}
Oct 06 06:49:27 crc kubenswrapper[4845]: I1006 06:49:27.093305 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7kmk" event={"ID":"ccce060c-c044-45d8-8d3c-92cc9e40198a","Type":"ContainerStarted","Data":"b71e52598f842aea591ad53a3f61e51b74a552261a7d9373647dbe59ce11346f"}
Oct 06 06:49:27 crc kubenswrapper[4845]: I1006 06:49:27.141809 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m7kmk" podStartSLOduration=3.765436863 podStartE2EDuration="5.141791516s" podCreationTimestamp="2025-10-06 06:49:22 +0000 UTC" firstStartedPulling="2025-10-06 06:49:24.067931343 +0000 UTC m=+248.582672351" lastFinishedPulling="2025-10-06 06:49:25.444285996 +0000 UTC m=+249.959027004" observedRunningTime="2025-10-06 06:49:27.140476689 +0000 UTC m=+251.655217707" watchObservedRunningTime="2025-10-06 06:49:27.141791516 +0000 UTC m=+251.656532524"
Oct 06 06:49:27 crc kubenswrapper[4845]: I1006 06:49:27.142317 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9qgph" podStartSLOduration=3.707098363 podStartE2EDuration="5.142313131s" podCreationTimestamp="2025-10-06 06:49:22 +0000 UTC" firstStartedPulling="2025-10-06 06:49:24.063243201 +0000 UTC m=+248.577984209" lastFinishedPulling="2025-10-06 06:49:25.498457969 +0000 UTC m=+250.013198977" observedRunningTime="2025-10-06 06:49:27.124521321 +0000 UTC m=+251.639262359" watchObservedRunningTime="2025-10-06 06:49:27.142313131 +0000 UTC m=+251.657054139"
Oct 06 06:49:30 crc kubenswrapper[4845]: I1006 06:49:30.142597 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ctv9g"
Oct 06 06:49:30 crc kubenswrapper[4845]: I1006 06:49:30.142901 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ctv9g"
Oct 06 06:49:30 crc kubenswrapper[4845]: I1006 06:49:30.186517 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ctv9g"
Oct 06 06:49:30 crc kubenswrapper[4845]: I1006 06:49:30.336107 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-msb4z"
Oct 06 06:49:30 crc kubenswrapper[4845]: I1006 06:49:30.336456 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-msb4z"
Oct 06 06:49:30 crc kubenswrapper[4845]: I1006 06:49:30.371124 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-msb4z"
Oct 06 06:49:31 crc kubenswrapper[4845]: I1006 06:49:31.151461 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-msb4z"
Oct 06 06:49:31 crc kubenswrapper[4845]: I1006 06:49:31.151824 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ctv9g"
Oct 06 06:49:32 crc kubenswrapper[4845]: I1006 06:49:32.563909 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9qgph"
Oct 06 06:49:32 crc kubenswrapper[4845]: I1006 06:49:32.563967 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9qgph"
Oct 06 06:49:32 crc kubenswrapper[4845]: I1006 06:49:32.607438 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9qgph"
Oct 06 06:49:32 crc kubenswrapper[4845]: I1006 06:49:32.770143 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m7kmk"
Oct 06 06:49:32 crc kubenswrapper[4845]: I1006 06:49:32.770186 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m7kmk"
Oct 06 06:49:32 crc kubenswrapper[4845]: I1006 06:49:32.808392 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m7kmk"
Oct 06 06:49:33 crc kubenswrapper[4845]: I1006 06:49:33.156289 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m7kmk"
Oct 06 06:49:33 crc kubenswrapper[4845]: I1006 06:49:33.156549 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9qgph"
Oct 06 06:50:53 crc kubenswrapper[4845]: I1006 06:50:53.019047 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 06:50:53 crc kubenswrapper[4845]: I1006 06:50:53.019750 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 06:51:23 crc kubenswrapper[4845]: I1006 06:51:23.019200 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 06:51:23 crc kubenswrapper[4845]: I1006 06:51:23.019923 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 06:51:53 crc kubenswrapper[4845]: I1006 06:51:53.018978 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 06:51:53 crc kubenswrapper[4845]: I1006 06:51:53.020562 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 06:51:53 crc kubenswrapper[4845]: I1006 06:51:53.020676 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6"
Oct 06 06:51:53 crc kubenswrapper[4845]: I1006 06:51:53.021654 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a91a13a44ef01ed072d381b9c9fc9a7b2fdaefa3297b1423272db8e0908ed6fd"} pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 06:51:53 crc kubenswrapper[4845]: I1006 06:51:53.021736 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" containerID="cri-o://a91a13a44ef01ed072d381b9c9fc9a7b2fdaefa3297b1423272db8e0908ed6fd" gracePeriod=600
Oct 06 06:51:53 crc kubenswrapper[4845]: I1006 06:51:53.869861 4845 generic.go:334] "Generic (PLEG): container finished" podID="6936952c-09f0-48fd-8832-38c18202ae81" containerID="a91a13a44ef01ed072d381b9c9fc9a7b2fdaefa3297b1423272db8e0908ed6fd" exitCode=0
Oct 06 06:51:53 crc kubenswrapper[4845]: I1006 06:51:53.869977 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerDied","Data":"a91a13a44ef01ed072d381b9c9fc9a7b2fdaefa3297b1423272db8e0908ed6fd"}
Oct 06 06:51:53 crc kubenswrapper[4845]: I1006 06:51:53.870426 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerStarted","Data":"79e33f4807773eca1fba9582255263880010ec7bbf733f8b698646f795a161ec"}
Oct 06 06:51:53 crc kubenswrapper[4845]: I1006 06:51:53.870450 4845 scope.go:117] "RemoveContainer" containerID="cde94420f9522e96296b7654ca5f759a70e419d2f236bc3e737a0e3e088adfca"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.262412 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-k4dcl"]
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.264182 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.320858 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-k4dcl"]
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.404595 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.404667 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gmrl\" (UniqueName: \"kubernetes.io/projected/e9ee4338-b1fa-4ef3-8489-c2453c36598f-kube-api-access-6gmrl\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.404710 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9ee4338-b1fa-4ef3-8489-c2453c36598f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.404734 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9ee4338-b1fa-4ef3-8489-c2453c36598f-registry-certificates\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.404765 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9ee4338-b1fa-4ef3-8489-c2453c36598f-bound-sa-token\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.404822 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9ee4338-b1fa-4ef3-8489-c2453c36598f-registry-tls\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.404844 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9ee4338-b1fa-4ef3-8489-c2453c36598f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.404881 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9ee4338-b1fa-4ef3-8489-c2453c36598f-trusted-ca\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.425244 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.505770 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gmrl\" (UniqueName: \"kubernetes.io/projected/e9ee4338-b1fa-4ef3-8489-c2453c36598f-kube-api-access-6gmrl\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.505840 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9ee4338-b1fa-4ef3-8489-c2453c36598f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.505872 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9ee4338-b1fa-4ef3-8489-c2453c36598f-registry-certificates\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.505896 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9ee4338-b1fa-4ef3-8489-c2453c36598f-bound-sa-token\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.505919 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9ee4338-b1fa-4ef3-8489-c2453c36598f-registry-tls\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.505942 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9ee4338-b1fa-4ef3-8489-c2453c36598f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.505972 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9ee4338-b1fa-4ef3-8489-c2453c36598f-trusted-ca\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.506883 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9ee4338-b1fa-4ef3-8489-c2453c36598f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.507449 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9ee4338-b1fa-4ef3-8489-c2453c36598f-registry-certificates\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.507621 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9ee4338-b1fa-4ef3-8489-c2453c36598f-trusted-ca\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.511585 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9ee4338-b1fa-4ef3-8489-c2453c36598f-registry-tls\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.515844 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9ee4338-b1fa-4ef3-8489-c2453c36598f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.521461 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9ee4338-b1fa-4ef3-8489-c2453c36598f-bound-sa-token\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.523400 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gmrl\" (UniqueName: \"kubernetes.io/projected/e9ee4338-b1fa-4ef3-8489-c2453c36598f-kube-api-access-6gmrl\") pod \"image-registry-66df7c8f76-k4dcl\" (UID: \"e9ee4338-b1fa-4ef3-8489-c2453c36598f\") " pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:05 crc kubenswrapper[4845]: I1006 06:52:05.589361 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:06 crc kubenswrapper[4845]: I1006 06:52:06.044352 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-k4dcl"]
Oct 06 06:52:06 crc kubenswrapper[4845]: W1006 06:52:06.055955 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9ee4338_b1fa_4ef3_8489_c2453c36598f.slice/crio-7afc968bf31a2cd0d3515706de649d83799f8c4c2a1495fd27c673b1b8bd7dd4 WatchSource:0}: Error finding container 7afc968bf31a2cd0d3515706de649d83799f8c4c2a1495fd27c673b1b8bd7dd4: Status 404 returned error can't find the container with id 7afc968bf31a2cd0d3515706de649d83799f8c4c2a1495fd27c673b1b8bd7dd4
Oct 06 06:52:06 crc kubenswrapper[4845]: I1006 06:52:06.940953 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl" event={"ID":"e9ee4338-b1fa-4ef3-8489-c2453c36598f","Type":"ContainerStarted","Data":"e941469614466aa3be3daa6e368009a71725d25312f6710100401f249552fa59"}
Oct 06 06:52:06 crc kubenswrapper[4845]: I1006 06:52:06.941009 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl" event={"ID":"e9ee4338-b1fa-4ef3-8489-c2453c36598f","Type":"ContainerStarted","Data":"7afc968bf31a2cd0d3515706de649d83799f8c4c2a1495fd27c673b1b8bd7dd4"}
Oct 06 06:52:06 crc kubenswrapper[4845]: I1006 06:52:06.941046 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:06 crc kubenswrapper[4845]: I1006 06:52:06.961012 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl" podStartSLOduration=1.9609940319999999 podStartE2EDuration="1.960994032s" podCreationTimestamp="2025-10-06 06:52:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:52:06.957251437 +0000 UTC m=+411.471992465" watchObservedRunningTime="2025-10-06 06:52:06.960994032 +0000 UTC m=+411.475735040"
Oct 06 06:52:25 crc kubenswrapper[4845]: I1006 06:52:25.593869 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-k4dcl"
Oct 06 06:52:25 crc kubenswrapper[4845]: I1006 06:52:25.636554 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9n6tj"]
Oct 06 06:52:50 crc kubenswrapper[4845]: I1006 06:52:50.678288 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" podUID="a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11" containerName="registry" containerID="cri-o://a529c1b3d2451077b143cb8ee2daf713c0331dee08f31eccdbaccf1775aac713" gracePeriod=30
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.095009 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.159972 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-installation-pull-secrets\") pod \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") "
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.160046 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-registry-certificates\") pod \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") "
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.160099 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-trusted-ca\") pod \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") "
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.160126 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-bound-sa-token\") pod \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") "
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.160356 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") "
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.160430 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj8nq\" (UniqueName: \"kubernetes.io/projected/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-kube-api-access-cj8nq\") pod \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") "
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.160462 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-ca-trust-extracted\") pod \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") "
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.160502 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-registry-tls\") pod \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\" (UID: \"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11\") "
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.161521 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.161597 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.173719 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-kube-api-access-cj8nq" (OuterVolumeSpecName: "kube-api-access-cj8nq") pod "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11"). InnerVolumeSpecName "kube-api-access-cj8nq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.175498 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.177599 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.177838 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.178310 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.183211 4845 generic.go:334] "Generic (PLEG): container finished" podID="a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11" containerID="a529c1b3d2451077b143cb8ee2daf713c0331dee08f31eccdbaccf1775aac713" exitCode=0
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.183239 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj"
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.183264 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" event={"ID":"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11","Type":"ContainerDied","Data":"a529c1b3d2451077b143cb8ee2daf713c0331dee08f31eccdbaccf1775aac713"}
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.183359 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9n6tj" event={"ID":"a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11","Type":"ContainerDied","Data":"ada81241639488e2403073aa2c5d497b9e86c82e021fecf4948ac3dfd0d1be49"}
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.183402 4845 scope.go:117] "RemoveContainer" containerID="a529c1b3d2451077b143cb8ee2daf713c0331dee08f31eccdbaccf1775aac713"
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.183451 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11" (UID: "a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.211010 4845 scope.go:117] "RemoveContainer" containerID="a529c1b3d2451077b143cb8ee2daf713c0331dee08f31eccdbaccf1775aac713"
Oct 06 06:52:51 crc kubenswrapper[4845]: E1006 06:52:51.211932 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a529c1b3d2451077b143cb8ee2daf713c0331dee08f31eccdbaccf1775aac713\": container with ID starting with a529c1b3d2451077b143cb8ee2daf713c0331dee08f31eccdbaccf1775aac713 not found: ID does not exist" containerID="a529c1b3d2451077b143cb8ee2daf713c0331dee08f31eccdbaccf1775aac713"
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.211965 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a529c1b3d2451077b143cb8ee2daf713c0331dee08f31eccdbaccf1775aac713"} err="failed to get container status \"a529c1b3d2451077b143cb8ee2daf713c0331dee08f31eccdbaccf1775aac713\": rpc error: code = NotFound desc = could not find container \"a529c1b3d2451077b143cb8ee2daf713c0331dee08f31eccdbaccf1775aac713\": container with ID starting with a529c1b3d2451077b143cb8ee2daf713c0331dee08f31eccdbaccf1775aac713 not found: ID does not exist"
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.262077 4845 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.262111 4845 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-registry-certificates\") on node \"crc\" DevicePath \"\""
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.262122 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.262133 4845 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-bound-sa-token\") on node \"crc\" DevicePath \"\""
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.262144 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj8nq\" (UniqueName: \"kubernetes.io/projected/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-kube-api-access-cj8nq\") on node \"crc\" DevicePath \"\""
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.262152 4845 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.262160 4845 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11-registry-tls\") on node \"crc\" DevicePath \"\""
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.517250 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9n6tj"]
Oct 06 06:52:51 crc kubenswrapper[4845]: I1006 06:52:51.521504 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9n6tj"]
Oct 06 06:52:52 crc kubenswrapper[4845]: I1006 06:52:52.234687 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11" path="/var/lib/kubelet/pods/a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11/volumes"
Oct 06 06:53:53 crc kubenswrapper[4845]: I1006 06:53:53.019621 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 06:53:53 crc kubenswrapper[4845]: I1006 06:53:53.020430 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.019310 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.019972 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.744219 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-g57wq"]
Oct 06 06:54:23 crc kubenswrapper[4845]: E1006 06:54:23.744731 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11" containerName="registry"
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.744744 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11" containerName="registry"
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.744847 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8f710e5-2ff8-4c50-8913-ad9b6cf2dd11" containerName="registry"
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.745223 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-g57wq"
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.748152 4845 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-c4bs6"
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.748217 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.748505 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.750470 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wnjt\" (UniqueName: \"kubernetes.io/projected/a2715e14-d1ac-4227-b55e-ad1207b34e92-kube-api-access-4wnjt\") pod \"cert-manager-cainjector-7f985d654d-g57wq\" (UID: \"a2715e14-d1ac-4227-b55e-ad1207b34e92\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-g57wq"
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.751834 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-s9n2l"]
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.752508 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-s9n2l"
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.754437 4845 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-pfrdb"
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.759351 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-s9n2l"]
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.780510 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-jtnq6"]
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.781264 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-jtnq6"
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.783575 4845 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-nd29d"
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.793954 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-jtnq6"]
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.799939 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-g57wq"]
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.851272 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wnjt\" (UniqueName: \"kubernetes.io/projected/a2715e14-d1ac-4227-b55e-ad1207b34e92-kube-api-access-4wnjt\") pod \"cert-manager-cainjector-7f985d654d-g57wq\" (UID: \"a2715e14-d1ac-4227-b55e-ad1207b34e92\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-g57wq"
Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.873339 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wnjt\" (UniqueName:
\"kubernetes.io/projected/a2715e14-d1ac-4227-b55e-ad1207b34e92-kube-api-access-4wnjt\") pod \"cert-manager-cainjector-7f985d654d-g57wq\" (UID: \"a2715e14-d1ac-4227-b55e-ad1207b34e92\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-g57wq" Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.952478 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd62s\" (UniqueName: \"kubernetes.io/projected/d6f5fa25-c3ab-47a0-8541-dfdf1ed2d29e-kube-api-access-xd62s\") pod \"cert-manager-5b446d88c5-s9n2l\" (UID: \"d6f5fa25-c3ab-47a0-8541-dfdf1ed2d29e\") " pod="cert-manager/cert-manager-5b446d88c5-s9n2l" Oct 06 06:54:23 crc kubenswrapper[4845]: I1006 06:54:23.952566 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvtfz\" (UniqueName: \"kubernetes.io/projected/f555f420-64f8-46d7-a41a-d24e3257aea5-kube-api-access-zvtfz\") pod \"cert-manager-webhook-5655c58dd6-jtnq6\" (UID: \"f555f420-64f8-46d7-a41a-d24e3257aea5\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-jtnq6" Oct 06 06:54:24 crc kubenswrapper[4845]: I1006 06:54:24.053326 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvtfz\" (UniqueName: \"kubernetes.io/projected/f555f420-64f8-46d7-a41a-d24e3257aea5-kube-api-access-zvtfz\") pod \"cert-manager-webhook-5655c58dd6-jtnq6\" (UID: \"f555f420-64f8-46d7-a41a-d24e3257aea5\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-jtnq6" Oct 06 06:54:24 crc kubenswrapper[4845]: I1006 06:54:24.053471 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd62s\" (UniqueName: \"kubernetes.io/projected/d6f5fa25-c3ab-47a0-8541-dfdf1ed2d29e-kube-api-access-xd62s\") pod \"cert-manager-5b446d88c5-s9n2l\" (UID: \"d6f5fa25-c3ab-47a0-8541-dfdf1ed2d29e\") " pod="cert-manager/cert-manager-5b446d88c5-s9n2l" Oct 06 06:54:24 crc kubenswrapper[4845]: I1006 
06:54:24.067269 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-g57wq" Oct 06 06:54:24 crc kubenswrapper[4845]: I1006 06:54:24.078022 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvtfz\" (UniqueName: \"kubernetes.io/projected/f555f420-64f8-46d7-a41a-d24e3257aea5-kube-api-access-zvtfz\") pod \"cert-manager-webhook-5655c58dd6-jtnq6\" (UID: \"f555f420-64f8-46d7-a41a-d24e3257aea5\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-jtnq6" Oct 06 06:54:24 crc kubenswrapper[4845]: I1006 06:54:24.078952 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd62s\" (UniqueName: \"kubernetes.io/projected/d6f5fa25-c3ab-47a0-8541-dfdf1ed2d29e-kube-api-access-xd62s\") pod \"cert-manager-5b446d88c5-s9n2l\" (UID: \"d6f5fa25-c3ab-47a0-8541-dfdf1ed2d29e\") " pod="cert-manager/cert-manager-5b446d88c5-s9n2l" Oct 06 06:54:24 crc kubenswrapper[4845]: I1006 06:54:24.097668 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-jtnq6" Oct 06 06:54:24 crc kubenswrapper[4845]: I1006 06:54:24.270769 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-g57wq"] Oct 06 06:54:24 crc kubenswrapper[4845]: I1006 06:54:24.278966 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 06:54:24 crc kubenswrapper[4845]: I1006 06:54:24.328463 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-jtnq6"] Oct 06 06:54:24 crc kubenswrapper[4845]: W1006 06:54:24.333412 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf555f420_64f8_46d7_a41a_d24e3257aea5.slice/crio-66de0a138e672d3d677349cfd333802216cd13d55d11c16b4d4a38b3f0af69ef WatchSource:0}: Error finding container 66de0a138e672d3d677349cfd333802216cd13d55d11c16b4d4a38b3f0af69ef: Status 404 returned error can't find the container with id 66de0a138e672d3d677349cfd333802216cd13d55d11c16b4d4a38b3f0af69ef Oct 06 06:54:24 crc kubenswrapper[4845]: I1006 06:54:24.376204 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-s9n2l" Oct 06 06:54:24 crc kubenswrapper[4845]: I1006 06:54:24.539273 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-s9n2l"] Oct 06 06:54:24 crc kubenswrapper[4845]: W1006 06:54:24.544691 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6f5fa25_c3ab_47a0_8541_dfdf1ed2d29e.slice/crio-984e5cb203678a01430b2436132ba6fc28ccb377a66e2110cd53a55409f4096b WatchSource:0}: Error finding container 984e5cb203678a01430b2436132ba6fc28ccb377a66e2110cd53a55409f4096b: Status 404 returned error can't find the container with id 984e5cb203678a01430b2436132ba6fc28ccb377a66e2110cd53a55409f4096b Oct 06 06:54:24 crc kubenswrapper[4845]: I1006 06:54:24.735663 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-g57wq" event={"ID":"a2715e14-d1ac-4227-b55e-ad1207b34e92","Type":"ContainerStarted","Data":"4d449b3e8a538a07bd989f5b612c97bf7c0fb99bf4012b4a195a85fb27197564"} Oct 06 06:54:24 crc kubenswrapper[4845]: I1006 06:54:24.736861 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-jtnq6" event={"ID":"f555f420-64f8-46d7-a41a-d24e3257aea5","Type":"ContainerStarted","Data":"66de0a138e672d3d677349cfd333802216cd13d55d11c16b4d4a38b3f0af69ef"} Oct 06 06:54:24 crc kubenswrapper[4845]: I1006 06:54:24.738538 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-s9n2l" event={"ID":"d6f5fa25-c3ab-47a0-8541-dfdf1ed2d29e","Type":"ContainerStarted","Data":"984e5cb203678a01430b2436132ba6fc28ccb377a66e2110cd53a55409f4096b"} Oct 06 06:54:27 crc kubenswrapper[4845]: I1006 06:54:27.760783 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-g57wq" 
event={"ID":"a2715e14-d1ac-4227-b55e-ad1207b34e92","Type":"ContainerStarted","Data":"cafc9c4eab0a9e90a093f0270787fca21d85e2547eba8a1e65564386b485d2d0"} Oct 06 06:54:27 crc kubenswrapper[4845]: I1006 06:54:27.762490 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-jtnq6" event={"ID":"f555f420-64f8-46d7-a41a-d24e3257aea5","Type":"ContainerStarted","Data":"fdbc13540c3f554e88bf492afda14113672fdae3b4068a4dab7fd20603790251"} Oct 06 06:54:27 crc kubenswrapper[4845]: I1006 06:54:27.762637 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-jtnq6" Oct 06 06:54:27 crc kubenswrapper[4845]: I1006 06:54:27.763900 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-s9n2l" event={"ID":"d6f5fa25-c3ab-47a0-8541-dfdf1ed2d29e","Type":"ContainerStarted","Data":"82bf089486327963f181d24ade6ea228d992f8e1ae8b0caf4ff1a5da72bf2006"} Oct 06 06:54:27 crc kubenswrapper[4845]: I1006 06:54:27.774940 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-g57wq" podStartSLOduration=1.770507588 podStartE2EDuration="4.774923824s" podCreationTimestamp="2025-10-06 06:54:23 +0000 UTC" firstStartedPulling="2025-10-06 06:54:24.278586308 +0000 UTC m=+548.793327316" lastFinishedPulling="2025-10-06 06:54:27.283002544 +0000 UTC m=+551.797743552" observedRunningTime="2025-10-06 06:54:27.77396041 +0000 UTC m=+552.288701418" watchObservedRunningTime="2025-10-06 06:54:27.774923824 +0000 UTC m=+552.289664832" Oct 06 06:54:27 crc kubenswrapper[4845]: I1006 06:54:27.802511 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-s9n2l" podStartSLOduration=1.9269157460000002 podStartE2EDuration="4.802493977s" podCreationTimestamp="2025-10-06 06:54:23 +0000 UTC" firstStartedPulling="2025-10-06 06:54:24.546762105 +0000 UTC 
m=+549.061503113" lastFinishedPulling="2025-10-06 06:54:27.422340336 +0000 UTC m=+551.937081344" observedRunningTime="2025-10-06 06:54:27.801754109 +0000 UTC m=+552.316495117" watchObservedRunningTime="2025-10-06 06:54:27.802493977 +0000 UTC m=+552.317234985" Oct 06 06:54:27 crc kubenswrapper[4845]: I1006 06:54:27.802799 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-jtnq6" podStartSLOduration=1.793433257 podStartE2EDuration="4.802794264s" podCreationTimestamp="2025-10-06 06:54:23 +0000 UTC" firstStartedPulling="2025-10-06 06:54:24.335146299 +0000 UTC m=+548.849887307" lastFinishedPulling="2025-10-06 06:54:27.344507306 +0000 UTC m=+551.859248314" observedRunningTime="2025-10-06 06:54:27.792122234 +0000 UTC m=+552.306863262" watchObservedRunningTime="2025-10-06 06:54:27.802794264 +0000 UTC m=+552.317535272" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.102208 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-jtnq6" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.490122 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-587xc"] Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.490815 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovn-controller" containerID="cri-o://e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413" gracePeriod=30 Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.490905 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="nbdb" containerID="cri-o://4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52" gracePeriod=30 Oct 06 06:54:34 
crc kubenswrapper[4845]: I1006 06:54:34.491007 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovn-acl-logging" containerID="cri-o://8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e" gracePeriod=30 Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.491032 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="kube-rbac-proxy-node" containerID="cri-o://04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0" gracePeriod=30 Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.491098 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6" gracePeriod=30 Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.491093 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="northd" containerID="cri-o://0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429" gracePeriod=30 Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.491596 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="sbdb" containerID="cri-o://14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d" gracePeriod=30 Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.549550 4845 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-587xc" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovnkube-controller" containerID="cri-o://5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177" gracePeriod=30 Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.806687 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovnkube-controller/3.log" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.808687 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovn-acl-logging/0.log" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.809292 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovn-controller/0.log" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.809704 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.823179 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zpn9l_2080026c-9eee-4863-b62d-e9ce4d4525dd/kube-multus/2.log" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.823859 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zpn9l_2080026c-9eee-4863-b62d-e9ce4d4525dd/kube-multus/1.log" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.823899 4845 generic.go:334] "Generic (PLEG): container finished" podID="2080026c-9eee-4863-b62d-e9ce4d4525dd" containerID="cce252b8a800605895c878ef155f99e6ad73ee5ff140006ed993f2d0d046df30" exitCode=2 Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.823954 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zpn9l" event={"ID":"2080026c-9eee-4863-b62d-e9ce4d4525dd","Type":"ContainerDied","Data":"cce252b8a800605895c878ef155f99e6ad73ee5ff140006ed993f2d0d046df30"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.823992 4845 scope.go:117] "RemoveContainer" containerID="a5fb957ec713b15c373dd40160d79fa6038407e541c2603db92c1fa4f6e96959" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.824542 4845 scope.go:117] "RemoveContainer" containerID="cce252b8a800605895c878ef155f99e6ad73ee5ff140006ed993f2d0d046df30" Oct 06 06:54:34 crc kubenswrapper[4845]: E1006 06:54:34.824819 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-zpn9l_openshift-multus(2080026c-9eee-4863-b62d-e9ce4d4525dd)\"" pod="openshift-multus/multus-zpn9l" podUID="2080026c-9eee-4863-b62d-e9ce4d4525dd" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.829542 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovnkube-controller/3.log" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.832195 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovn-acl-logging/0.log" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.833053 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-587xc_58772108-964d-4d0c-90a4-70ad5fe1da2d/ovn-controller/0.log" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.833709 4845 generic.go:334] "Generic (PLEG): container finished" podID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerID="5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177" exitCode=0 Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.833737 4845 generic.go:334] "Generic (PLEG): container finished" podID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerID="14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d" exitCode=0 Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.833879 4845 generic.go:334] "Generic (PLEG): container finished" podID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerID="4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52" exitCode=0 Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.833891 4845 generic.go:334] "Generic (PLEG): container finished" podID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerID="0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429" exitCode=0 Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.833901 4845 generic.go:334] "Generic (PLEG): container finished" podID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerID="17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6" exitCode=0 Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.833909 4845 generic.go:334] "Generic (PLEG): container finished" 
podID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerID="04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0" exitCode=0 Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.833901 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerDied","Data":"5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.833959 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerDied","Data":"14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.833979 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerDied","Data":"4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.833993 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerDied","Data":"0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834011 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerDied","Data":"17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834028 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" 
event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerDied","Data":"04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834042 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834057 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834065 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834075 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834083 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834091 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834098 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834106 4845 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834114 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834122 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834132 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerDied","Data":"8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834143 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834153 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834162 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834171 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52"} Oct 06 06:54:34 crc kubenswrapper[4845]: 
I1006 06:54:34.834182 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834192 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834202 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834211 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834219 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.834228 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.833917 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.833918 4845 generic.go:334] "Generic (PLEG): container finished" podID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerID="8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e" exitCode=143 Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835193 4845 generic.go:334] "Generic (PLEG): container finished" podID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerID="e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413" exitCode=143 Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835219 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerDied","Data":"e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835265 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835277 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835285 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835292 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835300 4845 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835332 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835340 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835349 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835358 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835365 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835399 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-587xc" event={"ID":"58772108-964d-4d0c-90a4-70ad5fe1da2d","Type":"ContainerDied","Data":"1edac2b123f605cafe95364a4698dd6563a25c0d8c5d7af759cef5ee95439e65"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835413 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177"} Oct 06 
06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835421 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835428 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835435 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835443 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835451 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835459 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835468 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835476 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413"} Oct 06 
06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.835483 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084"} Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.866513 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fn9ps"] Oct 06 06:54:34 crc kubenswrapper[4845]: E1006 06:54:34.866784 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovnkube-controller" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.866796 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovnkube-controller" Oct 06 06:54:34 crc kubenswrapper[4845]: E1006 06:54:34.866808 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovnkube-controller" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.866814 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovnkube-controller" Oct 06 06:54:34 crc kubenswrapper[4845]: E1006 06:54:34.866820 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="nbdb" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.866825 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="nbdb" Oct 06 06:54:34 crc kubenswrapper[4845]: E1006 06:54:34.866832 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="kube-rbac-proxy-ovn-metrics" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.866838 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" 
containerName="kube-rbac-proxy-ovn-metrics" Oct 06 06:54:34 crc kubenswrapper[4845]: E1006 06:54:34.866848 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovn-controller" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.866854 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovn-controller" Oct 06 06:54:34 crc kubenswrapper[4845]: E1006 06:54:34.866864 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="northd" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.866869 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="northd" Oct 06 06:54:34 crc kubenswrapper[4845]: E1006 06:54:34.866880 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovnkube-controller" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.866886 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovnkube-controller" Oct 06 06:54:34 crc kubenswrapper[4845]: E1006 06:54:34.866893 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="kube-rbac-proxy-node" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.866898 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="kube-rbac-proxy-node" Oct 06 06:54:34 crc kubenswrapper[4845]: E1006 06:54:34.866905 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovn-acl-logging" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.866910 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" 
containerName="ovn-acl-logging" Oct 06 06:54:34 crc kubenswrapper[4845]: E1006 06:54:34.866919 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="kubecfg-setup" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.866925 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="kubecfg-setup" Oct 06 06:54:34 crc kubenswrapper[4845]: E1006 06:54:34.866933 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="sbdb" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.866940 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="sbdb" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.867041 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="northd" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.867050 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovnkube-controller" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.867056 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovn-acl-logging" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.867063 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="sbdb" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.867071 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="nbdb" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.867081 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovn-controller" Oct 06 
06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.867089 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="kube-rbac-proxy-node" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.867098 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="kube-rbac-proxy-ovn-metrics" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.867104 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovnkube-controller" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.867112 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovnkube-controller" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.867119 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovnkube-controller" Oct 06 06:54:34 crc kubenswrapper[4845]: E1006 06:54:34.867206 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovnkube-controller" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.867213 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovnkube-controller" Oct 06 06:54:34 crc kubenswrapper[4845]: E1006 06:54:34.867220 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovnkube-controller" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.867225 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovnkube-controller" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.867304 4845 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" containerName="ovnkube-controller" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.869151 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.873547 4845 scope.go:117] "RemoveContainer" containerID="5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.886264 4845 scope.go:117] "RemoveContainer" containerID="87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.907758 4845 scope.go:117] "RemoveContainer" containerID="14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.918228 4845 scope.go:117] "RemoveContainer" containerID="4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.928289 4845 scope.go:117] "RemoveContainer" containerID="0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.949537 4845 scope.go:117] "RemoveContainer" containerID="17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.959918 4845 scope.go:117] "RemoveContainer" containerID="04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.971423 4845 scope.go:117] "RemoveContainer" containerID="8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.981420 4845 scope.go:117] "RemoveContainer" containerID="e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.987257 4845 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-cni-netd\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.987297 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-run-openvswitch\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.987332 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-log-socket\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.987344 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.987387 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.987496 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-run-ovn-kubernetes\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.987517 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-run-ovn\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.987539 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-run-systemd\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.987525 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-log-socket" (OuterVolumeSpecName: "log-socket") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.987562 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-slash\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.987582 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.987586 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.987586 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/58772108-964d-4d0c-90a4-70ad5fe1da2d-ovnkube-config\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.987655 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zd8j\" (UniqueName: \"kubernetes.io/projected/58772108-964d-4d0c-90a4-70ad5fe1da2d-kube-api-access-9zd8j\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.987683 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-var-lib-openvswitch\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.987706 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-run-netns\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.987733 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.987909 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/58772108-964d-4d0c-90a4-70ad5fe1da2d-env-overrides\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988011 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58772108-964d-4d0c-90a4-70ad5fe1da2d-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988042 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-slash" (OuterVolumeSpecName: "host-slash") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988077 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988405 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-node-log\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988454 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988485 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-systemd-units\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988486 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58772108-964d-4d0c-90a4-70ad5fe1da2d-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988536 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-kubelet\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988534 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-node-log" (OuterVolumeSpecName: "node-log") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988577 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/58772108-964d-4d0c-90a4-70ad5fe1da2d-ovnkube-script-lib\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988558 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988627 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/58772108-964d-4d0c-90a4-70ad5fe1da2d-ovn-node-metrics-cert\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988625 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988564 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988648 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-etc-openvswitch\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988693 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-cni-bin\") pod \"58772108-964d-4d0c-90a4-70ad5fe1da2d\" (UID: \"58772108-964d-4d0c-90a4-70ad5fe1da2d\") " Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988766 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988818 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988885 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-var-lib-openvswitch\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988931 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-etc-openvswitch\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988988 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58772108-964d-4d0c-90a4-70ad5fe1da2d-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.988995 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-systemd-units\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989032 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-node-log\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989074 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-run-openvswitch\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989093 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-slash\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989112 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-cni-netd\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989151 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-run-ovn-kubernetes\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989189 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989229 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-cni-bin\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989247 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-run-ovn\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989273 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-ovnkube-script-lib\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989318 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-log-socket\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989340 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-kubelet\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989363 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-ovn-node-metrics-cert\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989395 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw6qj\" (UniqueName: \"kubernetes.io/projected/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-kube-api-access-gw6qj\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989458 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-run-netns\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989501 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-run-systemd\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989530 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-ovnkube-config\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989552 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-env-overrides\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989629 4845 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989647 4845 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-run-ovn\") on node \"crc\" DevicePath \"\""
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989660 4845 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-slash\") on node \"crc\" DevicePath \"\""
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989673 4845 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/58772108-964d-4d0c-90a4-70ad5fe1da2d-ovnkube-config\") on node \"crc\" DevicePath \"\""
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989683 4845 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989694 4845 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-run-netns\") on node \"crc\" DevicePath \"\""
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989704 4845 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/58772108-964d-4d0c-90a4-70ad5fe1da2d-env-overrides\") on node \"crc\" DevicePath \"\""
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989714 4845 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-node-log\") on node \"crc\" DevicePath \"\""
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989726 4845 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989740 4845 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-systemd-units\") on node \"crc\" DevicePath \"\""
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989752 4845 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-kubelet\") on node \"crc\" DevicePath \"\""
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989765 4845 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/58772108-964d-4d0c-90a4-70ad5fe1da2d-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989778 4845 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989792 4845 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-cni-bin\") on node \"crc\" DevicePath \"\""
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989804 4845 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-host-cni-netd\") on node \"crc\" DevicePath \"\""
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989816 4845 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-run-openvswitch\") on node \"crc\" DevicePath \"\""
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.989827 4845 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-log-socket\") on node \"crc\" DevicePath \"\""
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.993133 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58772108-964d-4d0c-90a4-70ad5fe1da2d-kube-api-access-9zd8j" (OuterVolumeSpecName: "kube-api-access-9zd8j") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "kube-api-access-9zd8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.993477 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58772108-964d-4d0c-90a4-70ad5fe1da2d-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:54:34 crc kubenswrapper[4845]: I1006 06:54:34.994020 4845 scope.go:117] "RemoveContainer" containerID="5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.001722 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "58772108-964d-4d0c-90a4-70ad5fe1da2d" (UID: "58772108-964d-4d0c-90a4-70ad5fe1da2d"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.007878 4845 scope.go:117] "RemoveContainer" containerID="5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177"
Oct 06 06:54:35 crc kubenswrapper[4845]: E1006 06:54:35.008172 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177\": container with ID starting with 5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177 not found: ID does not exist" containerID="5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.008204 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177"} err="failed to get container status \"5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177\": rpc error: code = NotFound desc = could not find container \"5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177\": container with ID starting with 5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.008225 4845 scope.go:117] "RemoveContainer" containerID="87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce"
Oct 06 06:54:35 crc kubenswrapper[4845]: E1006 06:54:35.008562 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce\": container with ID starting with 87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce not found: ID does not exist" containerID="87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.008627 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce"} err="failed to get container status \"87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce\": rpc error: code = NotFound desc = could not find container \"87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce\": container with ID starting with 87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.008671 4845 scope.go:117] "RemoveContainer" containerID="14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d"
Oct 06 06:54:35 crc kubenswrapper[4845]: E1006 06:54:35.009009 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\": container with ID starting with 14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d not found: ID does not exist" containerID="14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.009063 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d"} err="failed to get container status \"14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\": rpc error: code = NotFound desc = could not find container \"14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\": container with ID starting with 14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.009100 4845 scope.go:117] "RemoveContainer" containerID="4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52"
Oct 06 06:54:35 crc kubenswrapper[4845]: E1006 06:54:35.009394 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\": container with ID starting with 4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52 not found: ID does not exist" containerID="4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.009420 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52"} err="failed to get container status \"4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\": rpc error: code = NotFound desc = could not find container \"4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\": container with ID starting with 4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.009435 4845 scope.go:117] "RemoveContainer" containerID="0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429"
Oct 06 06:54:35 crc kubenswrapper[4845]: E1006 06:54:35.009758 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\": container with ID starting with 0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429 not found: ID does not exist" containerID="0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.009778 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429"} err="failed to get container status \"0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\": rpc error: code = NotFound desc = could not find container \"0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\": container with ID starting with 0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.009791 4845 scope.go:117] "RemoveContainer" containerID="17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6"
Oct 06 06:54:35 crc kubenswrapper[4845]: E1006 06:54:35.010140 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\": container with ID starting with 17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6 not found: ID does not exist" containerID="17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.010186 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6"} err="failed to get container status \"17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\": rpc error: code = NotFound desc = could not find container \"17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\": container with ID starting with 17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.010214 4845 scope.go:117] "RemoveContainer" containerID="04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0"
Oct 06 06:54:35 crc kubenswrapper[4845]: E1006 06:54:35.010683 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\": container with ID starting with 04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0 not found: ID does not exist" containerID="04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.010721 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0"} err="failed to get container status \"04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\": rpc error: code = NotFound desc = could not find container \"04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\": container with ID starting with 04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.010743 4845 scope.go:117] "RemoveContainer" containerID="8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e"
Oct 06 06:54:35 crc kubenswrapper[4845]: E1006 06:54:35.011017 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\": container with ID starting with 8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e not found: ID does not exist" containerID="8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.011063 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e"} err="failed to get container status \"8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\": rpc error: code = NotFound desc = could not find container \"8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\": container with ID starting with 8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.011092 4845 scope.go:117] "RemoveContainer" containerID="e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413"
Oct 06 06:54:35 crc kubenswrapper[4845]: E1006 06:54:35.011533 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\": container with ID starting with e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413 not found: ID does not exist" containerID="e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.011564 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413"} err="failed to get container status \"e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\": rpc error: code = NotFound desc = could not find container \"e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\": container with ID starting with e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.011584 4845 scope.go:117] "RemoveContainer" containerID="5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084"
Oct 06 06:54:35 crc kubenswrapper[4845]: E1006 06:54:35.011872 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\": container with ID starting with 5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084 not found: ID does not exist" containerID="5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.011921 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084"} err="failed to get container status \"5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\": rpc error: code = NotFound desc = could not find container \"5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\": container with ID starting with 5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.011951 4845 scope.go:117] "RemoveContainer" containerID="5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.012353 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177"} err="failed to get container status \"5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177\": rpc error: code = NotFound desc = could not find container \"5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177\": container with ID starting with 5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.012441 4845 scope.go:117] "RemoveContainer" containerID="87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.012768 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce"} err="failed to get container status \"87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce\": rpc error: code = NotFound desc = could not find container \"87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce\": container with ID starting with 87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.012808 4845 scope.go:117] "RemoveContainer" containerID="14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.013115 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d"} err="failed to get container status \"14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\": rpc error: code = NotFound desc = could not find container \"14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\": container with ID starting with 14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.013140 4845 scope.go:117] "RemoveContainer" containerID="4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.013423 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52"} err="failed to get container status \"4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\": rpc error: code = NotFound desc = could not find container \"4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\": container with ID starting with 4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.013462 4845 scope.go:117] "RemoveContainer" containerID="0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.013707 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429"} err="failed to get container status \"0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\": rpc error: code = NotFound desc = could not find container \"0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\": container with ID starting with 0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.013737 4845 scope.go:117] "RemoveContainer" containerID="17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.014069 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6"} err="failed to get container status \"17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\": rpc error: code = NotFound desc = could not find container \"17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\": container with ID starting with 17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.014089 4845 scope.go:117] "RemoveContainer" containerID="04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.014460 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0"} err="failed to get container status \"04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\": rpc error: code = NotFound desc = could not find container \"04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\": container with ID starting with 04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.014490 4845 scope.go:117] "RemoveContainer" containerID="8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.014807 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e"} err="failed to get container status \"8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\": rpc error: code = NotFound desc = could not find container \"8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\": container with ID starting with 8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.014835 4845 scope.go:117] "RemoveContainer" containerID="e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.015130 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413"} err="failed to get container status \"e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\": rpc error: code = NotFound desc = could not find container \"e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\": container with ID starting with e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.015174 4845 scope.go:117] "RemoveContainer" containerID="5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.015458 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084"} err="failed to get container status \"5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\": rpc error: code = NotFound desc = could not find container \"5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\": container with ID starting with 5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.015479 4845 scope.go:117] "RemoveContainer" containerID="5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.015785 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177"} err="failed to get container status \"5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177\": rpc error: code = NotFound desc = could not find container \"5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177\": container with ID starting with 5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.015809 4845 scope.go:117] "RemoveContainer" containerID="87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.016210 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce"} err="failed to get container status \"87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce\": rpc error: code = NotFound desc = could not find container \"87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce\": container with ID starting with 87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.016245 4845 scope.go:117] "RemoveContainer" containerID="14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.016570 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d"} err="failed to get container status \"14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\": rpc error: code = NotFound desc = could not find container \"14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\": container with ID starting with 14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.016616 4845 scope.go:117] "RemoveContainer" containerID="4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.017042 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52"} err="failed to get container status \"4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\": rpc error: code = NotFound desc = could not find container \"4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\": container with ID starting with 4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.017070 4845 scope.go:117] "RemoveContainer" containerID="0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.017354 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429"} err="failed to get container status \"0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\": rpc error: code = NotFound desc = could not find container \"0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\": container with ID starting with 0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.017416 4845 scope.go:117] "RemoveContainer" containerID="17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.017703 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6"} err="failed to get container status \"17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\": rpc error: code = NotFound desc = could not find container \"17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\": container with ID starting with 17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.017735 4845 scope.go:117] "RemoveContainer" containerID="04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.018134 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0"} err="failed to get container status \"04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\": rpc error: code = NotFound desc = could not find container \"04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\": container with ID starting with 04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.018168 4845 scope.go:117] "RemoveContainer" containerID="8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.018449 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e"} err="failed to get container status \"8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\": rpc error: code = NotFound desc = could not find container \"8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\": container with ID starting with 8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.018490 4845 scope.go:117] "RemoveContainer" containerID="e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.018763 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413"} err="failed to get container status \"e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\": rpc error: code = NotFound desc = could not find container \"e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\": container with ID starting with e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.018785 4845 scope.go:117] "RemoveContainer" containerID="5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.018984 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084"} err="failed to get container status \"5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\": rpc error: code = NotFound desc = could not find container \"5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\": container with ID starting with 5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.019045 4845 scope.go:117] "RemoveContainer" containerID="5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.019236 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177"} err="failed to get container status \"5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177\": rpc error: code = NotFound desc = could not find container \"5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177\": container with ID starting with 5edfe3e22697dc33c653ef54801db1b48bd5d9c8d7b4ed32b62b12d93f4ec177 not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.019262 4845 scope.go:117] "RemoveContainer" containerID="87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.019562 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce"} err="failed to get container status \"87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce\": rpc error: code = NotFound desc = could not find container \"87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce\": container with ID starting with 87f66a59ed527c47a9218c23694b63d5ba270addc2fc35bdac92200243cea3ce not found: ID does not exist"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.019580 4845 scope.go:117] "RemoveContainer" containerID="14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d"
Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.019800 4845 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d"} err="failed to get container status \"14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\": rpc error: code = NotFound desc = could not find container \"14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d\": container with ID starting with 14c5cb4cb8da077510e3253deb3822966a4c16367f71da01c263e0f67a1d5d8d not found: ID does not exist" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.019834 4845 scope.go:117] "RemoveContainer" containerID="4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.020112 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52"} err="failed to get container status \"4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\": rpc error: code = NotFound desc = could not find container \"4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52\": container with ID starting with 4590b908dcd83cba2c58d2f7fac15ec65b2dcf1a7acf4c7b14558774b736be52 not found: ID does not exist" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.020131 4845 scope.go:117] "RemoveContainer" containerID="0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.020314 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429"} err="failed to get container status \"0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\": rpc error: code = NotFound desc = could not find container \"0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429\": container with ID starting with 0ec9a30c75988a110f2a2a62c60099859908adbea7b19116426dfd30edd11429 not found: ID does not 
exist" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.020336 4845 scope.go:117] "RemoveContainer" containerID="17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.020586 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6"} err="failed to get container status \"17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\": rpc error: code = NotFound desc = could not find container \"17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6\": container with ID starting with 17a18fbeb9a8fbc61531a14ff49f80fa0957bfae37f38045244b3411424f4df6 not found: ID does not exist" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.020604 4845 scope.go:117] "RemoveContainer" containerID="04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.020803 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0"} err="failed to get container status \"04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\": rpc error: code = NotFound desc = could not find container \"04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0\": container with ID starting with 04ce946372e37c6c4c7dc37068df04df0816e583e226b00eaffddedace2287e0 not found: ID does not exist" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.020818 4845 scope.go:117] "RemoveContainer" containerID="8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.021107 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e"} err="failed to get container status 
\"8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\": rpc error: code = NotFound desc = could not find container \"8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e\": container with ID starting with 8e88d20cc0c61a936b2854f77d4478464d2005abd68deb27ab0c13964213eb7e not found: ID does not exist" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.021158 4845 scope.go:117] "RemoveContainer" containerID="e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.021436 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413"} err="failed to get container status \"e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\": rpc error: code = NotFound desc = could not find container \"e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413\": container with ID starting with e8423bdc01b54aee6f6726a538988de94b06bd6948bd4060587ac05c4f8e4413 not found: ID does not exist" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.021471 4845 scope.go:117] "RemoveContainer" containerID="5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.021724 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084"} err="failed to get container status \"5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\": rpc error: code = NotFound desc = could not find container \"5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084\": container with ID starting with 5fb5ab89a1a3cbdb875e4ee0d308d562968f43dd86515e049141287b975f1084 not found: ID does not exist" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.091734 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.091781 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.091857 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-cni-bin\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.091890 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-run-ovn\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.091929 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-ovnkube-script-lib\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.091962 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-log-socket\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.091967 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-cni-bin\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.091992 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-kubelet\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092026 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-ovn-node-metrics-cert\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092037 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-log-socket\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092044 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-run-ovn\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092071 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-kubelet\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092060 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw6qj\" (UniqueName: \"kubernetes.io/projected/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-kube-api-access-gw6qj\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092164 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-run-netns\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092210 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-run-systemd\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092248 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-ovnkube-config\") pod 
\"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092273 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-env-overrides\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092319 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-var-lib-openvswitch\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092353 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-etc-openvswitch\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092453 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-systemd-units\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092451 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-run-netns\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092498 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-var-lib-openvswitch\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092517 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-node-log\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092486 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-node-log\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092556 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-etc-openvswitch\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092589 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-run-openvswitch\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092604 
4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-systemd-units\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092618 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-slash\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092641 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-slash\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092654 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-run-openvswitch\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092661 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-run-systemd\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092717 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-cni-netd\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092680 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-cni-netd\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092796 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-run-ovn-kubernetes\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092920 4845 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/58772108-964d-4d0c-90a4-70ad5fe1da2d-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092939 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-host-run-ovn-kubernetes\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.092943 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zd8j\" (UniqueName: \"kubernetes.io/projected/58772108-964d-4d0c-90a4-70ad5fe1da2d-kube-api-access-9zd8j\") on node \"crc\" DevicePath \"\"" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.093008 4845 
reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/58772108-964d-4d0c-90a4-70ad5fe1da2d-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.093931 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-env-overrides\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.094838 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-ovnkube-script-lib\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.095020 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-ovn-node-metrics-cert\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.095400 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-ovnkube-config\") pod \"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.106112 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw6qj\" (UniqueName: \"kubernetes.io/projected/da5ca180-bcd2-4d21-bdbc-5d1d03762a90-kube-api-access-gw6qj\") pod 
\"ovnkube-node-fn9ps\" (UID: \"da5ca180-bcd2-4d21-bdbc-5d1d03762a90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.176622 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-587xc"] Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.179827 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-587xc"] Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.192222 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.841511 4845 generic.go:334] "Generic (PLEG): container finished" podID="da5ca180-bcd2-4d21-bdbc-5d1d03762a90" containerID="52053a3cf3e970729580176c01f70fd9198c3704002a72134d11bc8a9ad96bf7" exitCode=0 Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.841610 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" event={"ID":"da5ca180-bcd2-4d21-bdbc-5d1d03762a90","Type":"ContainerDied","Data":"52053a3cf3e970729580176c01f70fd9198c3704002a72134d11bc8a9ad96bf7"} Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.841722 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" event={"ID":"da5ca180-bcd2-4d21-bdbc-5d1d03762a90","Type":"ContainerStarted","Data":"86db166601e34754a3c25334b2e1676e1dcf7eb15afa1019d75a21c9072e573a"} Oct 06 06:54:35 crc kubenswrapper[4845]: I1006 06:54:35.847083 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zpn9l_2080026c-9eee-4863-b62d-e9ce4d4525dd/kube-multus/2.log" Oct 06 06:54:36 crc kubenswrapper[4845]: I1006 06:54:36.237947 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58772108-964d-4d0c-90a4-70ad5fe1da2d" 
path="/var/lib/kubelet/pods/58772108-964d-4d0c-90a4-70ad5fe1da2d/volumes" Oct 06 06:54:36 crc kubenswrapper[4845]: I1006 06:54:36.857052 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" event={"ID":"da5ca180-bcd2-4d21-bdbc-5d1d03762a90","Type":"ContainerStarted","Data":"b741bf5f0ab8df7711f1b79eeef74e8ec491dc689e2bbf8957dfb942f1c10c53"} Oct 06 06:54:36 crc kubenswrapper[4845]: I1006 06:54:36.857105 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" event={"ID":"da5ca180-bcd2-4d21-bdbc-5d1d03762a90","Type":"ContainerStarted","Data":"776c3161f8adb43320b71196726c1c12bf235d841a6510b30fff639769fa168b"} Oct 06 06:54:36 crc kubenswrapper[4845]: I1006 06:54:36.857121 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" event={"ID":"da5ca180-bcd2-4d21-bdbc-5d1d03762a90","Type":"ContainerStarted","Data":"634f5d236042027541823bb9dcbc16232c734095728ec27cd7730d0dee72423c"} Oct 06 06:54:36 crc kubenswrapper[4845]: I1006 06:54:36.857153 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" event={"ID":"da5ca180-bcd2-4d21-bdbc-5d1d03762a90","Type":"ContainerStarted","Data":"bf26ac05d900e0c3965565239d55e17eccf87cb6b3f03487b7d98a81cc366487"} Oct 06 06:54:36 crc kubenswrapper[4845]: I1006 06:54:36.857165 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" event={"ID":"da5ca180-bcd2-4d21-bdbc-5d1d03762a90","Type":"ContainerStarted","Data":"91531948959d40cccf4ddf1b1e9a01de0cfdf44d25b7beaf6a25d62b2f65e750"} Oct 06 06:54:36 crc kubenswrapper[4845]: I1006 06:54:36.857177 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" event={"ID":"da5ca180-bcd2-4d21-bdbc-5d1d03762a90","Type":"ContainerStarted","Data":"99e88ed80c0ee6c1585841f167c0adc8cd76593e77498693e3a9269ef2a61594"} Oct 
06 06:54:38 crc kubenswrapper[4845]: I1006 06:54:38.887715 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" event={"ID":"da5ca180-bcd2-4d21-bdbc-5d1d03762a90","Type":"ContainerStarted","Data":"56ba5a7efa8160d600c9e85c37bbf9447a19b85f6416033ad407f33b847f5a81"}
Oct 06 06:54:41 crc kubenswrapper[4845]: I1006 06:54:41.909300 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" event={"ID":"da5ca180-bcd2-4d21-bdbc-5d1d03762a90","Type":"ContainerStarted","Data":"d30bcebb29a994ce0686effca7fe537b93fa01ce5b6485755be615dfeb19fcae"}
Oct 06 06:54:41 crc kubenswrapper[4845]: I1006 06:54:41.910246 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:54:41 crc kubenswrapper[4845]: I1006 06:54:41.910264 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:54:41 crc kubenswrapper[4845]: I1006 06:54:41.910274 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:54:41 crc kubenswrapper[4845]: I1006 06:54:41.963966 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:54:41 crc kubenswrapper[4845]: I1006 06:54:41.966085 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:54:41 crc kubenswrapper[4845]: I1006 06:54:41.996163 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps" podStartSLOduration=7.996141123 podStartE2EDuration="7.996141123s" podCreationTimestamp="2025-10-06 06:54:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:54:41.954121917 +0000 UTC m=+566.468862935" watchObservedRunningTime="2025-10-06 06:54:41.996141123 +0000 UTC m=+566.510882151"
Oct 06 06:54:47 crc kubenswrapper[4845]: I1006 06:54:47.227274 4845 scope.go:117] "RemoveContainer" containerID="cce252b8a800605895c878ef155f99e6ad73ee5ff140006ed993f2d0d046df30"
Oct 06 06:54:47 crc kubenswrapper[4845]: E1006 06:54:47.228980 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-zpn9l_openshift-multus(2080026c-9eee-4863-b62d-e9ce4d4525dd)\"" pod="openshift-multus/multus-zpn9l" podUID="2080026c-9eee-4863-b62d-e9ce4d4525dd"
Oct 06 06:54:53 crc kubenswrapper[4845]: I1006 06:54:53.018990 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 06:54:53 crc kubenswrapper[4845]: I1006 06:54:53.019062 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 06:54:53 crc kubenswrapper[4845]: I1006 06:54:53.019105 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6"
Oct 06 06:54:53 crc kubenswrapper[4845]: I1006 06:54:53.019699 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"79e33f4807773eca1fba9582255263880010ec7bbf733f8b698646f795a161ec"} pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 06:54:53 crc kubenswrapper[4845]: I1006 06:54:53.019764 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" containerID="cri-o://79e33f4807773eca1fba9582255263880010ec7bbf733f8b698646f795a161ec" gracePeriod=600
Oct 06 06:54:53 crc kubenswrapper[4845]: I1006 06:54:53.994639 4845 generic.go:334] "Generic (PLEG): container finished" podID="6936952c-09f0-48fd-8832-38c18202ae81" containerID="79e33f4807773eca1fba9582255263880010ec7bbf733f8b698646f795a161ec" exitCode=0
Oct 06 06:54:53 crc kubenswrapper[4845]: I1006 06:54:53.994745 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerDied","Data":"79e33f4807773eca1fba9582255263880010ec7bbf733f8b698646f795a161ec"}
Oct 06 06:54:53 crc kubenswrapper[4845]: I1006 06:54:53.995031 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerStarted","Data":"443ce9eb4600adb1d78ff49056676775ceac0472627215eb275bc31a89a7c3b8"}
Oct 06 06:54:53 crc kubenswrapper[4845]: I1006 06:54:53.995064 4845 scope.go:117] "RemoveContainer" containerID="a91a13a44ef01ed072d381b9c9fc9a7b2fdaefa3297b1423272db8e0908ed6fd"
Oct 06 06:54:58 crc kubenswrapper[4845]: I1006 06:54:58.227081 4845 scope.go:117] "RemoveContainer" containerID="cce252b8a800605895c878ef155f99e6ad73ee5ff140006ed993f2d0d046df30"
Oct 06 06:54:59 crc kubenswrapper[4845]: I1006 06:54:59.021419 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zpn9l_2080026c-9eee-4863-b62d-e9ce4d4525dd/kube-multus/2.log"
Oct 06 06:54:59 crc kubenswrapper[4845]: I1006 06:54:59.022045 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zpn9l" event={"ID":"2080026c-9eee-4863-b62d-e9ce4d4525dd","Type":"ContainerStarted","Data":"7895e22af1dcb4ab5cff862efedde049e24d5f66ed4b84ea38193508945bddd5"}
Oct 06 06:55:05 crc kubenswrapper[4845]: I1006 06:55:05.224706 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fn9ps"
Oct 06 06:55:08 crc kubenswrapper[4845]: I1006 06:55:08.264231 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm"]
Oct 06 06:55:08 crc kubenswrapper[4845]: I1006 06:55:08.265729 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm"
Oct 06 06:55:08 crc kubenswrapper[4845]: I1006 06:55:08.267673 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Oct 06 06:55:08 crc kubenswrapper[4845]: I1006 06:55:08.273564 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm"]
Oct 06 06:55:08 crc kubenswrapper[4845]: I1006 06:55:08.405913 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e70f5fe4-9f73-4919-87c7-2732d365bdd0-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm\" (UID: \"e70f5fe4-9f73-4919-87c7-2732d365bdd0\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm"
Oct 06 06:55:08 crc kubenswrapper[4845]: I1006 06:55:08.406079 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-927v8\" (UniqueName: \"kubernetes.io/projected/e70f5fe4-9f73-4919-87c7-2732d365bdd0-kube-api-access-927v8\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm\" (UID: \"e70f5fe4-9f73-4919-87c7-2732d365bdd0\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm"
Oct 06 06:55:08 crc kubenswrapper[4845]: I1006 06:55:08.406109 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e70f5fe4-9f73-4919-87c7-2732d365bdd0-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm\" (UID: \"e70f5fe4-9f73-4919-87c7-2732d365bdd0\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm"
Oct 06 06:55:08 crc kubenswrapper[4845]: I1006 06:55:08.506860 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-927v8\" (UniqueName: \"kubernetes.io/projected/e70f5fe4-9f73-4919-87c7-2732d365bdd0-kube-api-access-927v8\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm\" (UID: \"e70f5fe4-9f73-4919-87c7-2732d365bdd0\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm"
Oct 06 06:55:08 crc kubenswrapper[4845]: I1006 06:55:08.507145 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e70f5fe4-9f73-4919-87c7-2732d365bdd0-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm\" (UID: \"e70f5fe4-9f73-4919-87c7-2732d365bdd0\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm"
Oct 06 06:55:08 crc kubenswrapper[4845]: I1006 06:55:08.507267 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e70f5fe4-9f73-4919-87c7-2732d365bdd0-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm\" (UID: \"e70f5fe4-9f73-4919-87c7-2732d365bdd0\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm"
Oct 06 06:55:08 crc kubenswrapper[4845]: I1006 06:55:08.507735 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e70f5fe4-9f73-4919-87c7-2732d365bdd0-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm\" (UID: \"e70f5fe4-9f73-4919-87c7-2732d365bdd0\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm"
Oct 06 06:55:08 crc kubenswrapper[4845]: I1006 06:55:08.507756 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e70f5fe4-9f73-4919-87c7-2732d365bdd0-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm\" (UID: \"e70f5fe4-9f73-4919-87c7-2732d365bdd0\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm"
Oct 06 06:55:08 crc kubenswrapper[4845]: I1006 06:55:08.526028 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-927v8\" (UniqueName: \"kubernetes.io/projected/e70f5fe4-9f73-4919-87c7-2732d365bdd0-kube-api-access-927v8\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm\" (UID: \"e70f5fe4-9f73-4919-87c7-2732d365bdd0\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm"
Oct 06 06:55:08 crc kubenswrapper[4845]: I1006 06:55:08.585821 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm"
Oct 06 06:55:08 crc kubenswrapper[4845]: I1006 06:55:08.758255 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm"]
Oct 06 06:55:09 crc kubenswrapper[4845]: I1006 06:55:09.080731 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm" event={"ID":"e70f5fe4-9f73-4919-87c7-2732d365bdd0","Type":"ContainerStarted","Data":"9610d08b10f5f35ffa827b6b09c6d4e89fdadcd05eb1e6053fa86373ef273717"}
Oct 06 06:55:09 crc kubenswrapper[4845]: I1006 06:55:09.080778 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm" event={"ID":"e70f5fe4-9f73-4919-87c7-2732d365bdd0","Type":"ContainerStarted","Data":"ec4f922d678ce795aed351026e29d998ba2d318b7e06bdd46838ca74a9a1d900"}
Oct 06 06:55:10 crc kubenswrapper[4845]: I1006 06:55:10.087245 4845 generic.go:334] "Generic (PLEG): container finished" podID="e70f5fe4-9f73-4919-87c7-2732d365bdd0" containerID="9610d08b10f5f35ffa827b6b09c6d4e89fdadcd05eb1e6053fa86373ef273717" exitCode=0
Oct 06 06:55:10 crc kubenswrapper[4845]: I1006 06:55:10.087319 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm" event={"ID":"e70f5fe4-9f73-4919-87c7-2732d365bdd0","Type":"ContainerDied","Data":"9610d08b10f5f35ffa827b6b09c6d4e89fdadcd05eb1e6053fa86373ef273717"}
Oct 06 06:55:12 crc kubenswrapper[4845]: I1006 06:55:12.100020 4845 generic.go:334] "Generic (PLEG): container finished" podID="e70f5fe4-9f73-4919-87c7-2732d365bdd0" containerID="724caf354a5ee6b2911ee426967858259fdca7ee3ff5f88da8edfbe258c34217" exitCode=0
Oct 06 06:55:12 crc kubenswrapper[4845]: I1006 06:55:12.100059 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm" event={"ID":"e70f5fe4-9f73-4919-87c7-2732d365bdd0","Type":"ContainerDied","Data":"724caf354a5ee6b2911ee426967858259fdca7ee3ff5f88da8edfbe258c34217"}
Oct 06 06:55:13 crc kubenswrapper[4845]: I1006 06:55:13.108529 4845 generic.go:334] "Generic (PLEG): container finished" podID="e70f5fe4-9f73-4919-87c7-2732d365bdd0" containerID="9f9ea2ac163eb42919b8ae3a5055920b76a09bb44d39412654b627649aaa57be" exitCode=0
Oct 06 06:55:13 crc kubenswrapper[4845]: I1006 06:55:13.108577 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm" event={"ID":"e70f5fe4-9f73-4919-87c7-2732d365bdd0","Type":"ContainerDied","Data":"9f9ea2ac163eb42919b8ae3a5055920b76a09bb44d39412654b627649aaa57be"}
Oct 06 06:55:14 crc kubenswrapper[4845]: I1006 06:55:14.371094 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm"
Oct 06 06:55:14 crc kubenswrapper[4845]: I1006 06:55:14.480364 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e70f5fe4-9f73-4919-87c7-2732d365bdd0-util\") pod \"e70f5fe4-9f73-4919-87c7-2732d365bdd0\" (UID: \"e70f5fe4-9f73-4919-87c7-2732d365bdd0\") "
Oct 06 06:55:14 crc kubenswrapper[4845]: I1006 06:55:14.480487 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e70f5fe4-9f73-4919-87c7-2732d365bdd0-bundle\") pod \"e70f5fe4-9f73-4919-87c7-2732d365bdd0\" (UID: \"e70f5fe4-9f73-4919-87c7-2732d365bdd0\") "
Oct 06 06:55:14 crc kubenswrapper[4845]: I1006 06:55:14.480608 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-927v8\" (UniqueName: \"kubernetes.io/projected/e70f5fe4-9f73-4919-87c7-2732d365bdd0-kube-api-access-927v8\") pod \"e70f5fe4-9f73-4919-87c7-2732d365bdd0\" (UID: \"e70f5fe4-9f73-4919-87c7-2732d365bdd0\") "
Oct 06 06:55:14 crc kubenswrapper[4845]: I1006 06:55:14.481576 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e70f5fe4-9f73-4919-87c7-2732d365bdd0-bundle" (OuterVolumeSpecName: "bundle") pod "e70f5fe4-9f73-4919-87c7-2732d365bdd0" (UID: "e70f5fe4-9f73-4919-87c7-2732d365bdd0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:55:14 crc kubenswrapper[4845]: I1006 06:55:14.486362 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e70f5fe4-9f73-4919-87c7-2732d365bdd0-kube-api-access-927v8" (OuterVolumeSpecName: "kube-api-access-927v8") pod "e70f5fe4-9f73-4919-87c7-2732d365bdd0" (UID: "e70f5fe4-9f73-4919-87c7-2732d365bdd0"). InnerVolumeSpecName "kube-api-access-927v8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:55:14 crc kubenswrapper[4845]: I1006 06:55:14.582805 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-927v8\" (UniqueName: \"kubernetes.io/projected/e70f5fe4-9f73-4919-87c7-2732d365bdd0-kube-api-access-927v8\") on node \"crc\" DevicePath \"\""
Oct 06 06:55:14 crc kubenswrapper[4845]: I1006 06:55:14.583072 4845 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e70f5fe4-9f73-4919-87c7-2732d365bdd0-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 06:55:14 crc kubenswrapper[4845]: I1006 06:55:14.824505 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e70f5fe4-9f73-4919-87c7-2732d365bdd0-util" (OuterVolumeSpecName: "util") pod "e70f5fe4-9f73-4919-87c7-2732d365bdd0" (UID: "e70f5fe4-9f73-4919-87c7-2732d365bdd0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:55:14 crc kubenswrapper[4845]: I1006 06:55:14.887838 4845 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e70f5fe4-9f73-4919-87c7-2732d365bdd0-util\") on node \"crc\" DevicePath \"\""
Oct 06 06:55:15 crc kubenswrapper[4845]: I1006 06:55:15.120391 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm" event={"ID":"e70f5fe4-9f73-4919-87c7-2732d365bdd0","Type":"ContainerDied","Data":"ec4f922d678ce795aed351026e29d998ba2d318b7e06bdd46838ca74a9a1d900"}
Oct 06 06:55:15 crc kubenswrapper[4845]: I1006 06:55:15.120447 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec4f922d678ce795aed351026e29d998ba2d318b7e06bdd46838ca74a9a1d900"
Oct 06 06:55:15 crc kubenswrapper[4845]: I1006 06:55:15.120603 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm"
Oct 06 06:55:19 crc kubenswrapper[4845]: I1006 06:55:19.799361 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-8lmwp"]
Oct 06 06:55:19 crc kubenswrapper[4845]: E1006 06:55:19.799715 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70f5fe4-9f73-4919-87c7-2732d365bdd0" containerName="extract"
Oct 06 06:55:19 crc kubenswrapper[4845]: I1006 06:55:19.799734 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70f5fe4-9f73-4919-87c7-2732d365bdd0" containerName="extract"
Oct 06 06:55:19 crc kubenswrapper[4845]: E1006 06:55:19.799758 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70f5fe4-9f73-4919-87c7-2732d365bdd0" containerName="pull"
Oct 06 06:55:19 crc kubenswrapper[4845]: I1006 06:55:19.799766 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70f5fe4-9f73-4919-87c7-2732d365bdd0" containerName="pull"
Oct 06 06:55:19 crc kubenswrapper[4845]: E1006 06:55:19.799774 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70f5fe4-9f73-4919-87c7-2732d365bdd0" containerName="util"
Oct 06 06:55:19 crc kubenswrapper[4845]: I1006 06:55:19.799783 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70f5fe4-9f73-4919-87c7-2732d365bdd0" containerName="util"
Oct 06 06:55:19 crc kubenswrapper[4845]: I1006 06:55:19.799919 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="e70f5fe4-9f73-4919-87c7-2732d365bdd0" containerName="extract"
Oct 06 06:55:19 crc kubenswrapper[4845]: I1006 06:55:19.800504 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-8lmwp"
Oct 06 06:55:19 crc kubenswrapper[4845]: I1006 06:55:19.802244 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-6j6tc"
Oct 06 06:55:19 crc kubenswrapper[4845]: I1006 06:55:19.803333 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Oct 06 06:55:19 crc kubenswrapper[4845]: I1006 06:55:19.803367 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Oct 06 06:55:19 crc kubenswrapper[4845]: I1006 06:55:19.819514 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-8lmwp"]
Oct 06 06:55:19 crc kubenswrapper[4845]: I1006 06:55:19.947491 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2xdv\" (UniqueName: \"kubernetes.io/projected/55d7abcf-08e7-4529-8c8c-1f45ab1ea688-kube-api-access-t2xdv\") pod \"nmstate-operator-858ddd8f98-8lmwp\" (UID: \"55d7abcf-08e7-4529-8c8c-1f45ab1ea688\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-8lmwp"
Oct 06 06:55:20 crc kubenswrapper[4845]: I1006 06:55:20.048163 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2xdv\" (UniqueName: \"kubernetes.io/projected/55d7abcf-08e7-4529-8c8c-1f45ab1ea688-kube-api-access-t2xdv\") pod \"nmstate-operator-858ddd8f98-8lmwp\" (UID: \"55d7abcf-08e7-4529-8c8c-1f45ab1ea688\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-8lmwp"
Oct 06 06:55:20 crc kubenswrapper[4845]: I1006 06:55:20.066207 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2xdv\" (UniqueName: \"kubernetes.io/projected/55d7abcf-08e7-4529-8c8c-1f45ab1ea688-kube-api-access-t2xdv\") pod \"nmstate-operator-858ddd8f98-8lmwp\" (UID: \"55d7abcf-08e7-4529-8c8c-1f45ab1ea688\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-8lmwp"
Oct 06 06:55:20 crc kubenswrapper[4845]: I1006 06:55:20.115948 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-8lmwp"
Oct 06 06:55:20 crc kubenswrapper[4845]: I1006 06:55:20.366153 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-8lmwp"]
Oct 06 06:55:20 crc kubenswrapper[4845]: W1006 06:55:20.387914 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55d7abcf_08e7_4529_8c8c_1f45ab1ea688.slice/crio-3b33c62d22b8b44719459909e944cb299941f2eccc41d48283cb10459cec2cf7 WatchSource:0}: Error finding container 3b33c62d22b8b44719459909e944cb299941f2eccc41d48283cb10459cec2cf7: Status 404 returned error can't find the container with id 3b33c62d22b8b44719459909e944cb299941f2eccc41d48283cb10459cec2cf7
Oct 06 06:55:21 crc kubenswrapper[4845]: I1006 06:55:21.154564 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-8lmwp" event={"ID":"55d7abcf-08e7-4529-8c8c-1f45ab1ea688","Type":"ContainerStarted","Data":"3b33c62d22b8b44719459909e944cb299941f2eccc41d48283cb10459cec2cf7"}
Oct 06 06:55:23 crc kubenswrapper[4845]: I1006 06:55:23.164644 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-8lmwp" event={"ID":"55d7abcf-08e7-4529-8c8c-1f45ab1ea688","Type":"ContainerStarted","Data":"836c0de16c9f3df10d358d71b2c515ffb860ded2071c2532bd16d4d51166c79e"}
Oct 06 06:55:23 crc kubenswrapper[4845]: I1006 06:55:23.180013 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-8lmwp" podStartSLOduration=2.22748412 podStartE2EDuration="4.179998197s" podCreationTimestamp="2025-10-06 06:55:19 +0000 UTC" firstStartedPulling="2025-10-06 06:55:20.389950582 +0000 UTC m=+604.904691590" lastFinishedPulling="2025-10-06 06:55:22.342464659 +0000 UTC m=+606.857205667" observedRunningTime="2025-10-06 06:55:23.176368427 +0000 UTC m=+607.691109455" watchObservedRunningTime="2025-10-06 06:55:23.179998197 +0000 UTC m=+607.694739205"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.501609 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-qsw2n"]
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.503112 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-qsw2n"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.508763 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-htf2w"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.513143 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-qsw2n"]
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.524185 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-bs789"]
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.525063 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-bs789"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.528008 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-s52dg"]
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.528971 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-s52dg"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.530948 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.540271 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-s52dg"]
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.600759 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b-dbus-socket\") pod \"nmstate-handler-bs789\" (UID: \"a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b\") " pod="openshift-nmstate/nmstate-handler-bs789"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.600831 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9d129de3-ab21-48ea-a89c-0c59574eb288-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-s52dg\" (UID: \"9d129de3-ab21-48ea-a89c-0c59574eb288\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-s52dg"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.600864 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sgjd\" (UniqueName: \"kubernetes.io/projected/9d129de3-ab21-48ea-a89c-0c59574eb288-kube-api-access-5sgjd\") pod \"nmstate-webhook-6cdbc54649-s52dg\" (UID: \"9d129de3-ab21-48ea-a89c-0c59574eb288\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-s52dg"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.600896 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b-nmstate-lock\") pod \"nmstate-handler-bs789\" (UID: \"a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b\") " pod="openshift-nmstate/nmstate-handler-bs789"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.600925 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b-ovs-socket\") pod \"nmstate-handler-bs789\" (UID: \"a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b\") " pod="openshift-nmstate/nmstate-handler-bs789"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.600948 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmx8m\" (UniqueName: \"kubernetes.io/projected/a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b-kube-api-access-lmx8m\") pod \"nmstate-handler-bs789\" (UID: \"a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b\") " pod="openshift-nmstate/nmstate-handler-bs789"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.600968 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5g88\" (UniqueName: \"kubernetes.io/projected/f7a8f638-cc48-4cca-965a-c3d16476963c-kube-api-access-t5g88\") pod \"nmstate-metrics-fdff9cb8d-qsw2n\" (UID: \"f7a8f638-cc48-4cca-965a-c3d16476963c\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-qsw2n"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.622489 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-grzq8"]
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.623293 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-grzq8"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.625235 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.625596 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-svrfx"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.625813 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.632934 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-grzq8"]
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.701697 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sgjd\" (UniqueName: \"kubernetes.io/projected/9d129de3-ab21-48ea-a89c-0c59574eb288-kube-api-access-5sgjd\") pod \"nmstate-webhook-6cdbc54649-s52dg\" (UID: \"9d129de3-ab21-48ea-a89c-0c59574eb288\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-s52dg"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.701955 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b-nmstate-lock\") pod \"nmstate-handler-bs789\" (UID: \"a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b\") " pod="openshift-nmstate/nmstate-handler-bs789"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.702028 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b-ovs-socket\") pod \"nmstate-handler-bs789\" (UID: \"a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b\") " pod="openshift-nmstate/nmstate-handler-bs789"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.702061 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmx8m\" (UniqueName: \"kubernetes.io/projected/a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b-kube-api-access-lmx8m\") pod \"nmstate-handler-bs789\" (UID: \"a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b\") " pod="openshift-nmstate/nmstate-handler-bs789"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.702075 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b-nmstate-lock\") pod \"nmstate-handler-bs789\" (UID: \"a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b\") " pod="openshift-nmstate/nmstate-handler-bs789"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.702122 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5g88\" (UniqueName: \"kubernetes.io/projected/f7a8f638-cc48-4cca-965a-c3d16476963c-kube-api-access-t5g88\") pod \"nmstate-metrics-fdff9cb8d-qsw2n\" (UID: \"f7a8f638-cc48-4cca-965a-c3d16476963c\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-qsw2n"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.702213 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b-dbus-socket\") pod \"nmstate-handler-bs789\" (UID: \"a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b\") " pod="openshift-nmstate/nmstate-handler-bs789"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.702284 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9d129de3-ab21-48ea-a89c-0c59574eb288-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-s52dg\" (UID: \"9d129de3-ab21-48ea-a89c-0c59574eb288\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-s52dg"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.702188 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b-ovs-socket\") pod \"nmstate-handler-bs789\" (UID: \"a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b\") " pod="openshift-nmstate/nmstate-handler-bs789"
Oct 06 06:55:28 crc kubenswrapper[4845]: E1006 06:55:28.702460 4845 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Oct 06 06:55:28 crc kubenswrapper[4845]: E1006 06:55:28.703815 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d129de3-ab21-48ea-a89c-0c59574eb288-tls-key-pair podName:9d129de3-ab21-48ea-a89c-0c59574eb288 nodeName:}" failed. No retries permitted until 2025-10-06 06:55:29.203792416 +0000 UTC m=+613.718533424 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/9d129de3-ab21-48ea-a89c-0c59574eb288-tls-key-pair") pod "nmstate-webhook-6cdbc54649-s52dg" (UID: "9d129de3-ab21-48ea-a89c-0c59574eb288") : secret "openshift-nmstate-webhook" not found
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.702537 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b-dbus-socket\") pod \"nmstate-handler-bs789\" (UID: \"a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b\") " pod="openshift-nmstate/nmstate-handler-bs789"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.722281 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sgjd\" (UniqueName: \"kubernetes.io/projected/9d129de3-ab21-48ea-a89c-0c59574eb288-kube-api-access-5sgjd\") pod \"nmstate-webhook-6cdbc54649-s52dg\" (UID: \"9d129de3-ab21-48ea-a89c-0c59574eb288\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-s52dg"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.722291 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5g88\" (UniqueName: \"kubernetes.io/projected/f7a8f638-cc48-4cca-965a-c3d16476963c-kube-api-access-t5g88\") pod \"nmstate-metrics-fdff9cb8d-qsw2n\" (UID: \"f7a8f638-cc48-4cca-965a-c3d16476963c\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-qsw2n"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.723749 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmx8m\" (UniqueName: \"kubernetes.io/projected/a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b-kube-api-access-lmx8m\") pod \"nmstate-handler-bs789\" (UID: \"a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b\") " pod="openshift-nmstate/nmstate-handler-bs789"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.802753 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m27mq\" (UniqueName: \"kubernetes.io/projected/e39da7c7-c03a-458b-9f86-ef7a914b900a-kube-api-access-m27mq\") pod \"nmstate-console-plugin-6b874cbd85-grzq8\" (UID: \"e39da7c7-c03a-458b-9f86-ef7a914b900a\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-grzq8"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.802816 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e39da7c7-c03a-458b-9f86-ef7a914b900a-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-grzq8\" (UID: \"e39da7c7-c03a-458b-9f86-ef7a914b900a\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-grzq8"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.802868 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e39da7c7-c03a-458b-9f86-ef7a914b900a-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-grzq8\" (UID: \"e39da7c7-c03a-458b-9f86-ef7a914b900a\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-grzq8"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.806365 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6cc6bd8d84-bbvsk"]
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.807047 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cc6bd8d84-bbvsk"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.820364 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-qsw2n"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.825111 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cc6bd8d84-bbvsk"]
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.851954 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-bs789"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.903802 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c1325ba-9fc6-451e-a494-40323ccb2596-console-serving-cert\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.903849 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5c1325ba-9fc6-451e-a494-40323ccb2596-console-oauth-config\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.903884 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e39da7c7-c03a-458b-9f86-ef7a914b900a-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-grzq8\" (UID: \"e39da7c7-c03a-458b-9f86-ef7a914b900a\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-grzq8"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.903910 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5c1325ba-9fc6-451e-a494-40323ccb2596-console-config\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.903983 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2n7z\" (UniqueName: \"kubernetes.io/projected/5c1325ba-9fc6-451e-a494-40323ccb2596-kube-api-access-k2n7z\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.904114 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m27mq\" (UniqueName: \"kubernetes.io/projected/e39da7c7-c03a-458b-9f86-ef7a914b900a-kube-api-access-m27mq\") pod \"nmstate-console-plugin-6b874cbd85-grzq8\" (UID: \"e39da7c7-c03a-458b-9f86-ef7a914b900a\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-grzq8"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.904171 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5c1325ba-9fc6-451e-a494-40323ccb2596-service-ca\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk"
Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.904207 4845 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c1325ba-9fc6-451e-a494-40323ccb2596-trusted-ca-bundle\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.904234 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e39da7c7-c03a-458b-9f86-ef7a914b900a-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-grzq8\" (UID: \"e39da7c7-c03a-458b-9f86-ef7a914b900a\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-grzq8" Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.904253 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5c1325ba-9fc6-451e-a494-40323ccb2596-oauth-serving-cert\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.904764 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e39da7c7-c03a-458b-9f86-ef7a914b900a-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-grzq8\" (UID: \"e39da7c7-c03a-458b-9f86-ef7a914b900a\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-grzq8" Oct 06 06:55:28 crc kubenswrapper[4845]: I1006 06:55:28.909764 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e39da7c7-c03a-458b-9f86-ef7a914b900a-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-grzq8\" (UID: \"e39da7c7-c03a-458b-9f86-ef7a914b900a\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-grzq8" Oct 06 06:55:28 
crc kubenswrapper[4845]: I1006 06:55:28.944856 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m27mq\" (UniqueName: \"kubernetes.io/projected/e39da7c7-c03a-458b-9f86-ef7a914b900a-kube-api-access-m27mq\") pod \"nmstate-console-plugin-6b874cbd85-grzq8\" (UID: \"e39da7c7-c03a-458b-9f86-ef7a914b900a\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-grzq8" Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.005209 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5c1325ba-9fc6-451e-a494-40323ccb2596-service-ca\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.005261 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c1325ba-9fc6-451e-a494-40323ccb2596-trusted-ca-bundle\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.005282 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5c1325ba-9fc6-451e-a494-40323ccb2596-oauth-serving-cert\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.005327 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c1325ba-9fc6-451e-a494-40323ccb2596-console-serving-cert\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:29 crc 
kubenswrapper[4845]: I1006 06:55:29.005348 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5c1325ba-9fc6-451e-a494-40323ccb2596-console-oauth-config\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.005389 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5c1325ba-9fc6-451e-a494-40323ccb2596-console-config\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.005406 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2n7z\" (UniqueName: \"kubernetes.io/projected/5c1325ba-9fc6-451e-a494-40323ccb2596-kube-api-access-k2n7z\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.006805 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5c1325ba-9fc6-451e-a494-40323ccb2596-oauth-serving-cert\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.006956 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5c1325ba-9fc6-451e-a494-40323ccb2596-console-config\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 
06:55:29.007393 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5c1325ba-9fc6-451e-a494-40323ccb2596-service-ca\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.007432 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c1325ba-9fc6-451e-a494-40323ccb2596-trusted-ca-bundle\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.008608 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5c1325ba-9fc6-451e-a494-40323ccb2596-console-oauth-config\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.008907 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c1325ba-9fc6-451e-a494-40323ccb2596-console-serving-cert\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.025204 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2n7z\" (UniqueName: \"kubernetes.io/projected/5c1325ba-9fc6-451e-a494-40323ccb2596-kube-api-access-k2n7z\") pod \"console-6cc6bd8d84-bbvsk\" (UID: \"5c1325ba-9fc6-451e-a494-40323ccb2596\") " pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.088313 4845 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-qsw2n"] Oct 06 06:55:29 crc kubenswrapper[4845]: W1006 06:55:29.093493 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7a8f638_cc48_4cca_965a_c3d16476963c.slice/crio-30ccb66e46faf50c57fa4c122dce9d22d6538d9482cd9204f739383233bcef48 WatchSource:0}: Error finding container 30ccb66e46faf50c57fa4c122dce9d22d6538d9482cd9204f739383233bcef48: Status 404 returned error can't find the container with id 30ccb66e46faf50c57fa4c122dce9d22d6538d9482cd9204f739383233bcef48 Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.120431 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.199008 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-qsw2n" event={"ID":"f7a8f638-cc48-4cca-965a-c3d16476963c","Type":"ContainerStarted","Data":"30ccb66e46faf50c57fa4c122dce9d22d6538d9482cd9204f739383233bcef48"} Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.201545 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bs789" event={"ID":"a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b","Type":"ContainerStarted","Data":"a70d532be4d915421518701ef75da4889da611690a9af4e6ae61fdaae69f4129"} Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.207329 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9d129de3-ab21-48ea-a89c-0c59574eb288-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-s52dg\" (UID: \"9d129de3-ab21-48ea-a89c-0c59574eb288\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-s52dg" Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.210773 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" 
(UniqueName: \"kubernetes.io/secret/9d129de3-ab21-48ea-a89c-0c59574eb288-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-s52dg\" (UID: \"9d129de3-ab21-48ea-a89c-0c59574eb288\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-s52dg" Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.244669 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-grzq8" Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.391849 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-grzq8"] Oct 06 06:55:29 crc kubenswrapper[4845]: W1006 06:55:29.395917 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode39da7c7_c03a_458b_9f86_ef7a914b900a.slice/crio-e2e5ce61b31acb53389366746712b174d3783b1af24fcc8be215e1bfaf6ae124 WatchSource:0}: Error finding container e2e5ce61b31acb53389366746712b174d3783b1af24fcc8be215e1bfaf6ae124: Status 404 returned error can't find the container with id e2e5ce61b31acb53389366746712b174d3783b1af24fcc8be215e1bfaf6ae124 Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.460945 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-s52dg" Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.493253 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cc6bd8d84-bbvsk"] Oct 06 06:55:29 crc kubenswrapper[4845]: W1006 06:55:29.499926 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c1325ba_9fc6_451e_a494_40323ccb2596.slice/crio-f74404b7b54ba424f41a5b51f33dd52e1a9d6050b84a8ee9174002f380ee0437 WatchSource:0}: Error finding container f74404b7b54ba424f41a5b51f33dd52e1a9d6050b84a8ee9174002f380ee0437: Status 404 returned error can't find the container with id f74404b7b54ba424f41a5b51f33dd52e1a9d6050b84a8ee9174002f380ee0437 Oct 06 06:55:29 crc kubenswrapper[4845]: I1006 06:55:29.620709 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-s52dg"] Oct 06 06:55:29 crc kubenswrapper[4845]: W1006 06:55:29.625117 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d129de3_ab21_48ea_a89c_0c59574eb288.slice/crio-513af5f4eed45a9d58b08e79cb6b89546db774431c287bf615dca6646f93f217 WatchSource:0}: Error finding container 513af5f4eed45a9d58b08e79cb6b89546db774431c287bf615dca6646f93f217: Status 404 returned error can't find the container with id 513af5f4eed45a9d58b08e79cb6b89546db774431c287bf615dca6646f93f217 Oct 06 06:55:30 crc kubenswrapper[4845]: I1006 06:55:30.209772 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cc6bd8d84-bbvsk" event={"ID":"5c1325ba-9fc6-451e-a494-40323ccb2596","Type":"ContainerStarted","Data":"9d88dcc38aca06c3438f21780af23e0769591adb08b1dbaef5b31005f0f3c6dc"} Oct 06 06:55:30 crc kubenswrapper[4845]: I1006 06:55:30.209835 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cc6bd8d84-bbvsk" 
event={"ID":"5c1325ba-9fc6-451e-a494-40323ccb2596","Type":"ContainerStarted","Data":"f74404b7b54ba424f41a5b51f33dd52e1a9d6050b84a8ee9174002f380ee0437"} Oct 06 06:55:30 crc kubenswrapper[4845]: I1006 06:55:30.210808 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-grzq8" event={"ID":"e39da7c7-c03a-458b-9f86-ef7a914b900a","Type":"ContainerStarted","Data":"e2e5ce61b31acb53389366746712b174d3783b1af24fcc8be215e1bfaf6ae124"} Oct 06 06:55:30 crc kubenswrapper[4845]: I1006 06:55:30.211730 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-s52dg" event={"ID":"9d129de3-ab21-48ea-a89c-0c59574eb288","Type":"ContainerStarted","Data":"513af5f4eed45a9d58b08e79cb6b89546db774431c287bf615dca6646f93f217"} Oct 06 06:55:30 crc kubenswrapper[4845]: I1006 06:55:30.229075 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6cc6bd8d84-bbvsk" podStartSLOduration=2.22905229 podStartE2EDuration="2.22905229s" podCreationTimestamp="2025-10-06 06:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:55:30.225042811 +0000 UTC m=+614.739783829" watchObservedRunningTime="2025-10-06 06:55:30.22905229 +0000 UTC m=+614.743793298" Oct 06 06:55:33 crc kubenswrapper[4845]: I1006 06:55:33.231132 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-s52dg" event={"ID":"9d129de3-ab21-48ea-a89c-0c59574eb288","Type":"ContainerStarted","Data":"f086920bbc32859d06c6c5f7a15a3797cdc13844e73e664b032137e08112495e"} Oct 06 06:55:33 crc kubenswrapper[4845]: I1006 06:55:33.231557 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-s52dg" Oct 06 06:55:33 crc kubenswrapper[4845]: I1006 06:55:33.232949 4845 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bs789" event={"ID":"a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b","Type":"ContainerStarted","Data":"555aa9d9980b3ce374d9226b9db9eecc810373f36ef6700f6308d9f16c42d83d"} Oct 06 06:55:33 crc kubenswrapper[4845]: I1006 06:55:33.233066 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-bs789" Oct 06 06:55:33 crc kubenswrapper[4845]: I1006 06:55:33.234269 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-qsw2n" event={"ID":"f7a8f638-cc48-4cca-965a-c3d16476963c","Type":"ContainerStarted","Data":"c9a024bcca79e5ece647e60ec225fd79a8ddaae658b2857b731368240e442bcd"} Oct 06 06:55:33 crc kubenswrapper[4845]: I1006 06:55:33.253081 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-s52dg" podStartSLOduration=2.134226217 podStartE2EDuration="5.253064573s" podCreationTimestamp="2025-10-06 06:55:28 +0000 UTC" firstStartedPulling="2025-10-06 06:55:29.62704242 +0000 UTC m=+614.141783428" lastFinishedPulling="2025-10-06 06:55:32.745880776 +0000 UTC m=+617.260621784" observedRunningTime="2025-10-06 06:55:33.248934011 +0000 UTC m=+617.763675049" watchObservedRunningTime="2025-10-06 06:55:33.253064573 +0000 UTC m=+617.767805581" Oct 06 06:55:33 crc kubenswrapper[4845]: I1006 06:55:33.271503 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-bs789" podStartSLOduration=1.42421103 podStartE2EDuration="5.271460028s" podCreationTimestamp="2025-10-06 06:55:28 +0000 UTC" firstStartedPulling="2025-10-06 06:55:28.891990154 +0000 UTC m=+613.406731162" lastFinishedPulling="2025-10-06 06:55:32.739239152 +0000 UTC m=+617.253980160" observedRunningTime="2025-10-06 06:55:33.268089514 +0000 UTC m=+617.782830522" watchObservedRunningTime="2025-10-06 06:55:33.271460028 +0000 UTC m=+617.786201056" Oct 06 06:55:34 crc 
kubenswrapper[4845]: I1006 06:55:34.239874 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-grzq8" event={"ID":"e39da7c7-c03a-458b-9f86-ef7a914b900a","Type":"ContainerStarted","Data":"c60d5e91c9dfac18f8bb1f968ccacbc03a8c72ad4d402083c1ef73fadfce5e57"} Oct 06 06:55:34 crc kubenswrapper[4845]: I1006 06:55:34.257047 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-grzq8" podStartSLOduration=2.073410055 podStartE2EDuration="6.257031481s" podCreationTimestamp="2025-10-06 06:55:28 +0000 UTC" firstStartedPulling="2025-10-06 06:55:29.399415838 +0000 UTC m=+613.914156856" lastFinishedPulling="2025-10-06 06:55:33.583037274 +0000 UTC m=+618.097778282" observedRunningTime="2025-10-06 06:55:34.255077673 +0000 UTC m=+618.769818691" watchObservedRunningTime="2025-10-06 06:55:34.257031481 +0000 UTC m=+618.771772489" Oct 06 06:55:36 crc kubenswrapper[4845]: I1006 06:55:36.253297 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-qsw2n" event={"ID":"f7a8f638-cc48-4cca-965a-c3d16476963c","Type":"ContainerStarted","Data":"388221896c66e6e458a42c0771ae72424ecb3606019c8e92cef457e82fb485fd"} Oct 06 06:55:36 crc kubenswrapper[4845]: I1006 06:55:36.273705 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-qsw2n" podStartSLOduration=2.12221002 podStartE2EDuration="8.273690132s" podCreationTimestamp="2025-10-06 06:55:28 +0000 UTC" firstStartedPulling="2025-10-06 06:55:29.094866295 +0000 UTC m=+613.609607303" lastFinishedPulling="2025-10-06 06:55:35.246346407 +0000 UTC m=+619.761087415" observedRunningTime="2025-10-06 06:55:36.272429761 +0000 UTC m=+620.787170769" watchObservedRunningTime="2025-10-06 06:55:36.273690132 +0000 UTC m=+620.788431140" Oct 06 06:55:38 crc kubenswrapper[4845]: I1006 06:55:38.874328 4845 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-bs789" Oct 06 06:55:39 crc kubenswrapper[4845]: I1006 06:55:39.121257 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:39 crc kubenswrapper[4845]: I1006 06:55:39.121299 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:39 crc kubenswrapper[4845]: I1006 06:55:39.125932 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:39 crc kubenswrapper[4845]: I1006 06:55:39.274881 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6cc6bd8d84-bbvsk" Oct 06 06:55:39 crc kubenswrapper[4845]: I1006 06:55:39.331528 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-v6dlz"] Oct 06 06:55:49 crc kubenswrapper[4845]: I1006 06:55:49.466730 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-s52dg" Oct 06 06:56:03 crc kubenswrapper[4845]: I1006 06:56:03.859509 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d"] Oct 06 06:56:03 crc kubenswrapper[4845]: I1006 06:56:03.862677 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d" Oct 06 06:56:03 crc kubenswrapper[4845]: I1006 06:56:03.866171 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 06 06:56:03 crc kubenswrapper[4845]: I1006 06:56:03.872076 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d"] Oct 06 06:56:03 crc kubenswrapper[4845]: I1006 06:56:03.999442 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/602b8691-5aec-4f79-b690-a517191505b0-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d\" (UID: \"602b8691-5aec-4f79-b690-a517191505b0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d" Oct 06 06:56:03 crc kubenswrapper[4845]: I1006 06:56:03.999508 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn8ss\" (UniqueName: \"kubernetes.io/projected/602b8691-5aec-4f79-b690-a517191505b0-kube-api-access-sn8ss\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d\" (UID: \"602b8691-5aec-4f79-b690-a517191505b0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d" Oct 06 06:56:03 crc kubenswrapper[4845]: I1006 06:56:03.999536 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/602b8691-5aec-4f79-b690-a517191505b0-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d\" (UID: \"602b8691-5aec-4f79-b690-a517191505b0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d" Oct 06 06:56:04 crc kubenswrapper[4845]: 
I1006 06:56:04.101422 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/602b8691-5aec-4f79-b690-a517191505b0-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d\" (UID: \"602b8691-5aec-4f79-b690-a517191505b0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.101471 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn8ss\" (UniqueName: \"kubernetes.io/projected/602b8691-5aec-4f79-b690-a517191505b0-kube-api-access-sn8ss\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d\" (UID: \"602b8691-5aec-4f79-b690-a517191505b0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.101491 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/602b8691-5aec-4f79-b690-a517191505b0-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d\" (UID: \"602b8691-5aec-4f79-b690-a517191505b0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.101905 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/602b8691-5aec-4f79-b690-a517191505b0-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d\" (UID: \"602b8691-5aec-4f79-b690-a517191505b0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.101970 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/602b8691-5aec-4f79-b690-a517191505b0-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d\" (UID: \"602b8691-5aec-4f79-b690-a517191505b0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.124507 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn8ss\" (UniqueName: \"kubernetes.io/projected/602b8691-5aec-4f79-b690-a517191505b0-kube-api-access-sn8ss\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d\" (UID: \"602b8691-5aec-4f79-b690-a517191505b0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.186919 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.376797 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-v6dlz" podUID="55d881a0-9c07-42e7-aed8-8a883c4b1ff5" containerName="console" containerID="cri-o://557da3782f0adf2ba3139a4ee5a305565d54aa8c86409ec67047181a3d8260df" gracePeriod=15 Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.600309 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d"] Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.738928 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-v6dlz_55d881a0-9c07-42e7-aed8-8a883c4b1ff5/console/0.log" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.739008 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.809561 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-console-config\") pod \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.809612 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-service-ca\") pod \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.809689 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdtl6\" (UniqueName: \"kubernetes.io/projected/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-kube-api-access-xdtl6\") pod \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.809719 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-oauth-serving-cert\") pod \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.809737 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-console-oauth-config\") pod \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.809786 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-trusted-ca-bundle\") pod \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.809812 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-console-serving-cert\") pod \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\" (UID: \"55d881a0-9c07-42e7-aed8-8a883c4b1ff5\") " Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.810646 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "55d881a0-9c07-42e7-aed8-8a883c4b1ff5" (UID: "55d881a0-9c07-42e7-aed8-8a883c4b1ff5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.810687 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-service-ca" (OuterVolumeSpecName: "service-ca") pod "55d881a0-9c07-42e7-aed8-8a883c4b1ff5" (UID: "55d881a0-9c07-42e7-aed8-8a883c4b1ff5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.811196 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-console-config" (OuterVolumeSpecName: "console-config") pod "55d881a0-9c07-42e7-aed8-8a883c4b1ff5" (UID: "55d881a0-9c07-42e7-aed8-8a883c4b1ff5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.811318 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "55d881a0-9c07-42e7-aed8-8a883c4b1ff5" (UID: "55d881a0-9c07-42e7-aed8-8a883c4b1ff5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.814838 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "55d881a0-9c07-42e7-aed8-8a883c4b1ff5" (UID: "55d881a0-9c07-42e7-aed8-8a883c4b1ff5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.814892 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-kube-api-access-xdtl6" (OuterVolumeSpecName: "kube-api-access-xdtl6") pod "55d881a0-9c07-42e7-aed8-8a883c4b1ff5" (UID: "55d881a0-9c07-42e7-aed8-8a883c4b1ff5"). InnerVolumeSpecName "kube-api-access-xdtl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.815156 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "55d881a0-9c07-42e7-aed8-8a883c4b1ff5" (UID: "55d881a0-9c07-42e7-aed8-8a883c4b1ff5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.911629 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.911685 4845 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.911697 4845 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-console-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.911706 4845 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.911714 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdtl6\" (UniqueName: \"kubernetes.io/projected/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-kube-api-access-xdtl6\") on node \"crc\" DevicePath \"\"" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.911724 4845 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 06:56:04 crc kubenswrapper[4845]: I1006 06:56:04.911732 4845 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55d881a0-9c07-42e7-aed8-8a883c4b1ff5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:56:05 crc 
kubenswrapper[4845]: I1006 06:56:05.418640 4845 generic.go:334] "Generic (PLEG): container finished" podID="602b8691-5aec-4f79-b690-a517191505b0" containerID="073fb89a984b4314051386307842a876027bec0d261e8df24bb663bc3663e49f" exitCode=0 Oct 06 06:56:05 crc kubenswrapper[4845]: I1006 06:56:05.418729 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d" event={"ID":"602b8691-5aec-4f79-b690-a517191505b0","Type":"ContainerDied","Data":"073fb89a984b4314051386307842a876027bec0d261e8df24bb663bc3663e49f"} Oct 06 06:56:05 crc kubenswrapper[4845]: I1006 06:56:05.418761 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d" event={"ID":"602b8691-5aec-4f79-b690-a517191505b0","Type":"ContainerStarted","Data":"75c00e1ebe606f2b55c52480b3898fd882ea86c8430f5530df929a8fab04f73e"} Oct 06 06:56:05 crc kubenswrapper[4845]: I1006 06:56:05.421149 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-v6dlz_55d881a0-9c07-42e7-aed8-8a883c4b1ff5/console/0.log" Oct 06 06:56:05 crc kubenswrapper[4845]: I1006 06:56:05.421246 4845 generic.go:334] "Generic (PLEG): container finished" podID="55d881a0-9c07-42e7-aed8-8a883c4b1ff5" containerID="557da3782f0adf2ba3139a4ee5a305565d54aa8c86409ec67047181a3d8260df" exitCode=2 Oct 06 06:56:05 crc kubenswrapper[4845]: I1006 06:56:05.421320 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v6dlz" event={"ID":"55d881a0-9c07-42e7-aed8-8a883c4b1ff5","Type":"ContainerDied","Data":"557da3782f0adf2ba3139a4ee5a305565d54aa8c86409ec67047181a3d8260df"} Oct 06 06:56:05 crc kubenswrapper[4845]: I1006 06:56:05.421344 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-v6dlz" Oct 06 06:56:05 crc kubenswrapper[4845]: I1006 06:56:05.421401 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v6dlz" event={"ID":"55d881a0-9c07-42e7-aed8-8a883c4b1ff5","Type":"ContainerDied","Data":"30aa4041ac1fa882f8ad9834497173e95378fddda515c703e54666c9def1b2f0"} Oct 06 06:56:05 crc kubenswrapper[4845]: I1006 06:56:05.421451 4845 scope.go:117] "RemoveContainer" containerID="557da3782f0adf2ba3139a4ee5a305565d54aa8c86409ec67047181a3d8260df" Oct 06 06:56:05 crc kubenswrapper[4845]: I1006 06:56:05.460996 4845 scope.go:117] "RemoveContainer" containerID="557da3782f0adf2ba3139a4ee5a305565d54aa8c86409ec67047181a3d8260df" Oct 06 06:56:05 crc kubenswrapper[4845]: E1006 06:56:05.461582 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"557da3782f0adf2ba3139a4ee5a305565d54aa8c86409ec67047181a3d8260df\": container with ID starting with 557da3782f0adf2ba3139a4ee5a305565d54aa8c86409ec67047181a3d8260df not found: ID does not exist" containerID="557da3782f0adf2ba3139a4ee5a305565d54aa8c86409ec67047181a3d8260df" Oct 06 06:56:05 crc kubenswrapper[4845]: I1006 06:56:05.461647 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"557da3782f0adf2ba3139a4ee5a305565d54aa8c86409ec67047181a3d8260df"} err="failed to get container status \"557da3782f0adf2ba3139a4ee5a305565d54aa8c86409ec67047181a3d8260df\": rpc error: code = NotFound desc = could not find container \"557da3782f0adf2ba3139a4ee5a305565d54aa8c86409ec67047181a3d8260df\": container with ID starting with 557da3782f0adf2ba3139a4ee5a305565d54aa8c86409ec67047181a3d8260df not found: ID does not exist" Oct 06 06:56:05 crc kubenswrapper[4845]: I1006 06:56:05.470991 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-v6dlz"] Oct 06 06:56:05 crc 
kubenswrapper[4845]: I1006 06:56:05.476621 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-v6dlz"] Oct 06 06:56:06 crc kubenswrapper[4845]: I1006 06:56:06.238157 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55d881a0-9c07-42e7-aed8-8a883c4b1ff5" path="/var/lib/kubelet/pods/55d881a0-9c07-42e7-aed8-8a883c4b1ff5/volumes" Oct 06 06:56:07 crc kubenswrapper[4845]: I1006 06:56:07.440803 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d" event={"ID":"602b8691-5aec-4f79-b690-a517191505b0","Type":"ContainerStarted","Data":"e677621646733a35d4576c0513374bbc63f3c33b9fb6c5917eee3132c6507633"} Oct 06 06:56:08 crc kubenswrapper[4845]: I1006 06:56:08.449727 4845 generic.go:334] "Generic (PLEG): container finished" podID="602b8691-5aec-4f79-b690-a517191505b0" containerID="e677621646733a35d4576c0513374bbc63f3c33b9fb6c5917eee3132c6507633" exitCode=0 Oct 06 06:56:08 crc kubenswrapper[4845]: I1006 06:56:08.449776 4845 generic.go:334] "Generic (PLEG): container finished" podID="602b8691-5aec-4f79-b690-a517191505b0" containerID="e049b115be44e13470cfe3fe8e1245b7deee83cf2f7fa93ad193aae3728e6abb" exitCode=0 Oct 06 06:56:08 crc kubenswrapper[4845]: I1006 06:56:08.449809 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d" event={"ID":"602b8691-5aec-4f79-b690-a517191505b0","Type":"ContainerDied","Data":"e677621646733a35d4576c0513374bbc63f3c33b9fb6c5917eee3132c6507633"} Oct 06 06:56:08 crc kubenswrapper[4845]: I1006 06:56:08.449853 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d" event={"ID":"602b8691-5aec-4f79-b690-a517191505b0","Type":"ContainerDied","Data":"e049b115be44e13470cfe3fe8e1245b7deee83cf2f7fa93ad193aae3728e6abb"} Oct 06 
06:56:09 crc kubenswrapper[4845]: I1006 06:56:09.665184 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d" Oct 06 06:56:09 crc kubenswrapper[4845]: I1006 06:56:09.789696 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/602b8691-5aec-4f79-b690-a517191505b0-util\") pod \"602b8691-5aec-4f79-b690-a517191505b0\" (UID: \"602b8691-5aec-4f79-b690-a517191505b0\") " Oct 06 06:56:09 crc kubenswrapper[4845]: I1006 06:56:09.789741 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn8ss\" (UniqueName: \"kubernetes.io/projected/602b8691-5aec-4f79-b690-a517191505b0-kube-api-access-sn8ss\") pod \"602b8691-5aec-4f79-b690-a517191505b0\" (UID: \"602b8691-5aec-4f79-b690-a517191505b0\") " Oct 06 06:56:09 crc kubenswrapper[4845]: I1006 06:56:09.789795 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/602b8691-5aec-4f79-b690-a517191505b0-bundle\") pod \"602b8691-5aec-4f79-b690-a517191505b0\" (UID: \"602b8691-5aec-4f79-b690-a517191505b0\") " Oct 06 06:56:09 crc kubenswrapper[4845]: I1006 06:56:09.790801 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/602b8691-5aec-4f79-b690-a517191505b0-bundle" (OuterVolumeSpecName: "bundle") pod "602b8691-5aec-4f79-b690-a517191505b0" (UID: "602b8691-5aec-4f79-b690-a517191505b0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 06:56:09 crc kubenswrapper[4845]: I1006 06:56:09.795730 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/602b8691-5aec-4f79-b690-a517191505b0-kube-api-access-sn8ss" (OuterVolumeSpecName: "kube-api-access-sn8ss") pod "602b8691-5aec-4f79-b690-a517191505b0" (UID: "602b8691-5aec-4f79-b690-a517191505b0"). InnerVolumeSpecName "kube-api-access-sn8ss". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:56:09 crc kubenswrapper[4845]: I1006 06:56:09.805682 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/602b8691-5aec-4f79-b690-a517191505b0-util" (OuterVolumeSpecName: "util") pod "602b8691-5aec-4f79-b690-a517191505b0" (UID: "602b8691-5aec-4f79-b690-a517191505b0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 06:56:09 crc kubenswrapper[4845]: I1006 06:56:09.890917 4845 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/602b8691-5aec-4f79-b690-a517191505b0-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 06:56:09 crc kubenswrapper[4845]: I1006 06:56:09.890957 4845 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/602b8691-5aec-4f79-b690-a517191505b0-util\") on node \"crc\" DevicePath \"\"" Oct 06 06:56:09 crc kubenswrapper[4845]: I1006 06:56:09.890969 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn8ss\" (UniqueName: \"kubernetes.io/projected/602b8691-5aec-4f79-b690-a517191505b0-kube-api-access-sn8ss\") on node \"crc\" DevicePath \"\"" Oct 06 06:56:10 crc kubenswrapper[4845]: I1006 06:56:10.465513 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d" 
event={"ID":"602b8691-5aec-4f79-b690-a517191505b0","Type":"ContainerDied","Data":"75c00e1ebe606f2b55c52480b3898fd882ea86c8430f5530df929a8fab04f73e"} Oct 06 06:56:10 crc kubenswrapper[4845]: I1006 06:56:10.465941 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75c00e1ebe606f2b55c52480b3898fd882ea86c8430f5530df929a8fab04f73e" Oct 06 06:56:10 crc kubenswrapper[4845]: I1006 06:56:10.465559 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d" Oct 06 06:56:18 crc kubenswrapper[4845]: I1006 06:56:18.873287 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-687d5696cb-xscrv"] Oct 06 06:56:18 crc kubenswrapper[4845]: E1006 06:56:18.874888 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d881a0-9c07-42e7-aed8-8a883c4b1ff5" containerName="console" Oct 06 06:56:18 crc kubenswrapper[4845]: I1006 06:56:18.874953 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d881a0-9c07-42e7-aed8-8a883c4b1ff5" containerName="console" Oct 06 06:56:18 crc kubenswrapper[4845]: E1006 06:56:18.875007 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="602b8691-5aec-4f79-b690-a517191505b0" containerName="pull" Oct 06 06:56:18 crc kubenswrapper[4845]: I1006 06:56:18.875054 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="602b8691-5aec-4f79-b690-a517191505b0" containerName="pull" Oct 06 06:56:18 crc kubenswrapper[4845]: E1006 06:56:18.875110 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="602b8691-5aec-4f79-b690-a517191505b0" containerName="extract" Oct 06 06:56:18 crc kubenswrapper[4845]: I1006 06:56:18.875173 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="602b8691-5aec-4f79-b690-a517191505b0" containerName="extract" Oct 06 06:56:18 crc kubenswrapper[4845]: E1006 06:56:18.875224 4845 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="602b8691-5aec-4f79-b690-a517191505b0" containerName="util" Oct 06 06:56:18 crc kubenswrapper[4845]: I1006 06:56:18.875271 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="602b8691-5aec-4f79-b690-a517191505b0" containerName="util" Oct 06 06:56:18 crc kubenswrapper[4845]: I1006 06:56:18.875462 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="602b8691-5aec-4f79-b690-a517191505b0" containerName="extract" Oct 06 06:56:18 crc kubenswrapper[4845]: I1006 06:56:18.875525 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="55d881a0-9c07-42e7-aed8-8a883c4b1ff5" containerName="console" Oct 06 06:56:18 crc kubenswrapper[4845]: I1006 06:56:18.875997 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-687d5696cb-xscrv" Oct 06 06:56:18 crc kubenswrapper[4845]: I1006 06:56:18.879943 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-9g876" Oct 06 06:56:18 crc kubenswrapper[4845]: I1006 06:56:18.880190 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 06 06:56:18 crc kubenswrapper[4845]: I1006 06:56:18.880402 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 06 06:56:18 crc kubenswrapper[4845]: I1006 06:56:18.880823 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 06 06:56:18 crc kubenswrapper[4845]: I1006 06:56:18.881521 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 06 06:56:18 crc kubenswrapper[4845]: I1006 06:56:18.890176 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-687d5696cb-xscrv"] Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.042032 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5501cc41-c16f-423f-a782-96e0d186f44e-webhook-cert\") pod \"metallb-operator-controller-manager-687d5696cb-xscrv\" (UID: \"5501cc41-c16f-423f-a782-96e0d186f44e\") " pod="metallb-system/metallb-operator-controller-manager-687d5696cb-xscrv" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.042127 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c5cg\" (UniqueName: \"kubernetes.io/projected/5501cc41-c16f-423f-a782-96e0d186f44e-kube-api-access-6c5cg\") pod \"metallb-operator-controller-manager-687d5696cb-xscrv\" (UID: \"5501cc41-c16f-423f-a782-96e0d186f44e\") " pod="metallb-system/metallb-operator-controller-manager-687d5696cb-xscrv" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.042163 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5501cc41-c16f-423f-a782-96e0d186f44e-apiservice-cert\") pod \"metallb-operator-controller-manager-687d5696cb-xscrv\" (UID: \"5501cc41-c16f-423f-a782-96e0d186f44e\") " pod="metallb-system/metallb-operator-controller-manager-687d5696cb-xscrv" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.142938 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5501cc41-c16f-423f-a782-96e0d186f44e-apiservice-cert\") pod \"metallb-operator-controller-manager-687d5696cb-xscrv\" (UID: \"5501cc41-c16f-423f-a782-96e0d186f44e\") " pod="metallb-system/metallb-operator-controller-manager-687d5696cb-xscrv" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.142991 4845 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5501cc41-c16f-423f-a782-96e0d186f44e-webhook-cert\") pod \"metallb-operator-controller-manager-687d5696cb-xscrv\" (UID: \"5501cc41-c16f-423f-a782-96e0d186f44e\") " pod="metallb-system/metallb-operator-controller-manager-687d5696cb-xscrv" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.143083 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c5cg\" (UniqueName: \"kubernetes.io/projected/5501cc41-c16f-423f-a782-96e0d186f44e-kube-api-access-6c5cg\") pod \"metallb-operator-controller-manager-687d5696cb-xscrv\" (UID: \"5501cc41-c16f-423f-a782-96e0d186f44e\") " pod="metallb-system/metallb-operator-controller-manager-687d5696cb-xscrv" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.150994 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5501cc41-c16f-423f-a782-96e0d186f44e-apiservice-cert\") pod \"metallb-operator-controller-manager-687d5696cb-xscrv\" (UID: \"5501cc41-c16f-423f-a782-96e0d186f44e\") " pod="metallb-system/metallb-operator-controller-manager-687d5696cb-xscrv" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.151870 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5501cc41-c16f-423f-a782-96e0d186f44e-webhook-cert\") pod \"metallb-operator-controller-manager-687d5696cb-xscrv\" (UID: \"5501cc41-c16f-423f-a782-96e0d186f44e\") " pod="metallb-system/metallb-operator-controller-manager-687d5696cb-xscrv" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.162461 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c5cg\" (UniqueName: \"kubernetes.io/projected/5501cc41-c16f-423f-a782-96e0d186f44e-kube-api-access-6c5cg\") pod 
\"metallb-operator-controller-manager-687d5696cb-xscrv\" (UID: \"5501cc41-c16f-423f-a782-96e0d186f44e\") " pod="metallb-system/metallb-operator-controller-manager-687d5696cb-xscrv" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.202201 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-687d5696cb-xscrv" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.429836 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c77dc94b4-h9wms"] Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.431251 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7c77dc94b4-h9wms" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.440186 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-bczww" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.443576 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.443754 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.443798 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c77dc94b4-h9wms"] Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.457995 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-687d5696cb-xscrv"] Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.549954 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f925eeb5-fe5d-4479-9c06-be3069abc88d-apiservice-cert\") pod 
\"metallb-operator-webhook-server-7c77dc94b4-h9wms\" (UID: \"f925eeb5-fe5d-4479-9c06-be3069abc88d\") " pod="metallb-system/metallb-operator-webhook-server-7c77dc94b4-h9wms" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.550038 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nbxz\" (UniqueName: \"kubernetes.io/projected/f925eeb5-fe5d-4479-9c06-be3069abc88d-kube-api-access-9nbxz\") pod \"metallb-operator-webhook-server-7c77dc94b4-h9wms\" (UID: \"f925eeb5-fe5d-4479-9c06-be3069abc88d\") " pod="metallb-system/metallb-operator-webhook-server-7c77dc94b4-h9wms" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.550080 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f925eeb5-fe5d-4479-9c06-be3069abc88d-webhook-cert\") pod \"metallb-operator-webhook-server-7c77dc94b4-h9wms\" (UID: \"f925eeb5-fe5d-4479-9c06-be3069abc88d\") " pod="metallb-system/metallb-operator-webhook-server-7c77dc94b4-h9wms" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.557065 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-687d5696cb-xscrv" event={"ID":"5501cc41-c16f-423f-a782-96e0d186f44e","Type":"ContainerStarted","Data":"41c68670514bcd51f84f894c78ecff2f59fdb3a41359e48d50a872cc0c59b34d"} Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.651558 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f925eeb5-fe5d-4479-9c06-be3069abc88d-webhook-cert\") pod \"metallb-operator-webhook-server-7c77dc94b4-h9wms\" (UID: \"f925eeb5-fe5d-4479-9c06-be3069abc88d\") " pod="metallb-system/metallb-operator-webhook-server-7c77dc94b4-h9wms" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.651615 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f925eeb5-fe5d-4479-9c06-be3069abc88d-apiservice-cert\") pod \"metallb-operator-webhook-server-7c77dc94b4-h9wms\" (UID: \"f925eeb5-fe5d-4479-9c06-be3069abc88d\") " pod="metallb-system/metallb-operator-webhook-server-7c77dc94b4-h9wms" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.651669 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nbxz\" (UniqueName: \"kubernetes.io/projected/f925eeb5-fe5d-4479-9c06-be3069abc88d-kube-api-access-9nbxz\") pod \"metallb-operator-webhook-server-7c77dc94b4-h9wms\" (UID: \"f925eeb5-fe5d-4479-9c06-be3069abc88d\") " pod="metallb-system/metallb-operator-webhook-server-7c77dc94b4-h9wms" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.656385 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f925eeb5-fe5d-4479-9c06-be3069abc88d-apiservice-cert\") pod \"metallb-operator-webhook-server-7c77dc94b4-h9wms\" (UID: \"f925eeb5-fe5d-4479-9c06-be3069abc88d\") " pod="metallb-system/metallb-operator-webhook-server-7c77dc94b4-h9wms" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.656960 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f925eeb5-fe5d-4479-9c06-be3069abc88d-webhook-cert\") pod \"metallb-operator-webhook-server-7c77dc94b4-h9wms\" (UID: \"f925eeb5-fe5d-4479-9c06-be3069abc88d\") " pod="metallb-system/metallb-operator-webhook-server-7c77dc94b4-h9wms" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.668031 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nbxz\" (UniqueName: \"kubernetes.io/projected/f925eeb5-fe5d-4479-9c06-be3069abc88d-kube-api-access-9nbxz\") pod \"metallb-operator-webhook-server-7c77dc94b4-h9wms\" (UID: \"f925eeb5-fe5d-4479-9c06-be3069abc88d\") " 
pod="metallb-system/metallb-operator-webhook-server-7c77dc94b4-h9wms" Oct 06 06:56:19 crc kubenswrapper[4845]: I1006 06:56:19.753313 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7c77dc94b4-h9wms" Oct 06 06:56:20 crc kubenswrapper[4845]: I1006 06:56:20.127527 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c77dc94b4-h9wms"] Oct 06 06:56:20 crc kubenswrapper[4845]: I1006 06:56:20.562288 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7c77dc94b4-h9wms" event={"ID":"f925eeb5-fe5d-4479-9c06-be3069abc88d","Type":"ContainerStarted","Data":"12cf9f3080e1b8146b317bab9c82fde6298427280923ed718fd7bed36f339f10"} Oct 06 06:56:23 crc kubenswrapper[4845]: I1006 06:56:23.584576 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-687d5696cb-xscrv" event={"ID":"5501cc41-c16f-423f-a782-96e0d186f44e","Type":"ContainerStarted","Data":"0a6fc395bd2705d2a4b9eebbfc187caa97289abd76cca89c2a1c29bb3c8c27f8"} Oct 06 06:56:23 crc kubenswrapper[4845]: I1006 06:56:23.615054 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-687d5696cb-xscrv" podStartSLOduration=2.528460228 podStartE2EDuration="5.615027725s" podCreationTimestamp="2025-10-06 06:56:18 +0000 UTC" firstStartedPulling="2025-10-06 06:56:19.473286937 +0000 UTC m=+663.988027945" lastFinishedPulling="2025-10-06 06:56:22.559854444 +0000 UTC m=+667.074595442" observedRunningTime="2025-10-06 06:56:23.609133701 +0000 UTC m=+668.123874709" watchObservedRunningTime="2025-10-06 06:56:23.615027725 +0000 UTC m=+668.129768733" Oct 06 06:56:24 crc kubenswrapper[4845]: I1006 06:56:24.589925 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-687d5696cb-xscrv" Oct 06 06:56:25 crc kubenswrapper[4845]: I1006 06:56:25.596567 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7c77dc94b4-h9wms" event={"ID":"f925eeb5-fe5d-4479-9c06-be3069abc88d","Type":"ContainerStarted","Data":"eb2ec1f0f294728a98fce7ab439059a036a3b52c4dc1afe4e78f1e69df56dac7"} Oct 06 06:56:25 crc kubenswrapper[4845]: I1006 06:56:25.597124 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7c77dc94b4-h9wms" Oct 06 06:56:25 crc kubenswrapper[4845]: I1006 06:56:25.615731 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7c77dc94b4-h9wms" podStartSLOduration=1.863841282 podStartE2EDuration="6.615713608s" podCreationTimestamp="2025-10-06 06:56:19 +0000 UTC" firstStartedPulling="2025-10-06 06:56:20.133832853 +0000 UTC m=+664.648573861" lastFinishedPulling="2025-10-06 06:56:24.885705179 +0000 UTC m=+669.400446187" observedRunningTime="2025-10-06 06:56:25.613128214 +0000 UTC m=+670.127869222" watchObservedRunningTime="2025-10-06 06:56:25.615713608 +0000 UTC m=+670.130454616" Oct 06 06:56:39 crc kubenswrapper[4845]: I1006 06:56:39.758051 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7c77dc94b4-h9wms" Oct 06 06:56:53 crc kubenswrapper[4845]: I1006 06:56:53.019395 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 06:56:53 crc kubenswrapper[4845]: I1006 06:56:53.019978 4845 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.204367 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-687d5696cb-xscrv" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.891165 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-dpj7m"] Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.893890 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.895686 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-4d59j" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.896117 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.897046 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.906349 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-kctns"] Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.907626 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-kctns" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.910762 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.924217 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-kctns"] Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.989466 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-tsmdg"] Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.990326 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tsmdg" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.990946 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-frr-startup\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.991067 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-reloader\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.991146 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/396607a8-6c19-447d-a4a2-7a8ef92d957a-cert\") pod \"frr-k8s-webhook-server-64bf5d555-kctns\" (UID: \"396607a8-6c19-447d-a4a2-7a8ef92d957a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-kctns" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.991232 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-metrics-certs\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.991308 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-metrics\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.991422 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-frr-conf\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.991475 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-frr-sockets\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.991527 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd2zw\" (UniqueName: \"kubernetes.io/projected/396607a8-6c19-447d-a4a2-7a8ef92d957a-kube-api-access-xd2zw\") pod \"frr-k8s-webhook-server-64bf5d555-kctns\" (UID: \"396607a8-6c19-447d-a4a2-7a8ef92d957a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-kctns" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.991586 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6bd9\" (UniqueName: \"kubernetes.io/projected/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-kube-api-access-j6bd9\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.992269 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.992349 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.998210 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-8dgmv" Oct 06 06:56:59 crc kubenswrapper[4845]: I1006 06:56:59.999774 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-xbpss"] Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.001218 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.002508 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-xbpss" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.004550 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.009658 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-xbpss"] Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.093201 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd2zw\" (UniqueName: \"kubernetes.io/projected/396607a8-6c19-447d-a4a2-7a8ef92d957a-kube-api-access-xd2zw\") pod \"frr-k8s-webhook-server-64bf5d555-kctns\" (UID: \"396607a8-6c19-447d-a4a2-7a8ef92d957a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-kctns" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.093257 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcf3c696-ca9a-4b92-9314-dcc4edb86577-metrics-certs\") pod \"speaker-tsmdg\" (UID: \"fcf3c696-ca9a-4b92-9314-dcc4edb86577\") " pod="metallb-system/speaker-tsmdg" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.093281 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/fcf3c696-ca9a-4b92-9314-dcc4edb86577-metallb-excludel2\") pod \"speaker-tsmdg\" (UID: \"fcf3c696-ca9a-4b92-9314-dcc4edb86577\") " pod="metallb-system/speaker-tsmdg" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.093302 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpcwk\" (UniqueName: \"kubernetes.io/projected/96184c2a-b2b5-4dec-b93b-a875c0a07930-kube-api-access-rpcwk\") pod \"controller-68d546b9d8-xbpss\" (UID: \"96184c2a-b2b5-4dec-b93b-a875c0a07930\") " 
pod="metallb-system/controller-68d546b9d8-xbpss" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.093331 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/96184c2a-b2b5-4dec-b93b-a875c0a07930-cert\") pod \"controller-68d546b9d8-xbpss\" (UID: \"96184c2a-b2b5-4dec-b93b-a875c0a07930\") " pod="metallb-system/controller-68d546b9d8-xbpss" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.093351 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6bd9\" (UniqueName: \"kubernetes.io/projected/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-kube-api-access-j6bd9\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.093403 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-frr-startup\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.093430 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-reloader\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.093449 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/396607a8-6c19-447d-a4a2-7a8ef92d957a-cert\") pod \"frr-k8s-webhook-server-64bf5d555-kctns\" (UID: \"396607a8-6c19-447d-a4a2-7a8ef92d957a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-kctns" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.093468 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fcf3c696-ca9a-4b92-9314-dcc4edb86577-memberlist\") pod \"speaker-tsmdg\" (UID: \"fcf3c696-ca9a-4b92-9314-dcc4edb86577\") " pod="metallb-system/speaker-tsmdg" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.093485 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-metrics-certs\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.093501 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-metrics\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.093518 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-frr-conf\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.093537 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96184c2a-b2b5-4dec-b93b-a875c0a07930-metrics-certs\") pod \"controller-68d546b9d8-xbpss\" (UID: \"96184c2a-b2b5-4dec-b93b-a875c0a07930\") " pod="metallb-system/controller-68d546b9d8-xbpss" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.093554 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w5gh\" (UniqueName: 
\"kubernetes.io/projected/fcf3c696-ca9a-4b92-9314-dcc4edb86577-kube-api-access-7w5gh\") pod \"speaker-tsmdg\" (UID: \"fcf3c696-ca9a-4b92-9314-dcc4edb86577\") " pod="metallb-system/speaker-tsmdg" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.093571 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-frr-sockets\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.093954 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-frr-sockets\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.094175 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-metrics\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.094414 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-frr-conf\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:57:00 crc kubenswrapper[4845]: E1006 06:57:00.094496 4845 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.094594 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-reloader\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:57:00 crc kubenswrapper[4845]: E1006 06:57:00.094597 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/396607a8-6c19-447d-a4a2-7a8ef92d957a-cert podName:396607a8-6c19-447d-a4a2-7a8ef92d957a nodeName:}" failed. No retries permitted until 2025-10-06 06:57:00.594577161 +0000 UTC m=+705.109318159 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/396607a8-6c19-447d-a4a2-7a8ef92d957a-cert") pod "frr-k8s-webhook-server-64bf5d555-kctns" (UID: "396607a8-6c19-447d-a4a2-7a8ef92d957a") : secret "frr-k8s-webhook-server-cert" not found Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.094496 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-frr-startup\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.099847 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-metrics-certs\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.107100 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd2zw\" (UniqueName: \"kubernetes.io/projected/396607a8-6c19-447d-a4a2-7a8ef92d957a-kube-api-access-xd2zw\") pod \"frr-k8s-webhook-server-64bf5d555-kctns\" (UID: \"396607a8-6c19-447d-a4a2-7a8ef92d957a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-kctns" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 
06:57:00.111549 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6bd9\" (UniqueName: \"kubernetes.io/projected/ff6e65ed-e4d3-4fce-b1c9-87eb219c2924-kube-api-access-j6bd9\") pod \"frr-k8s-dpj7m\" (UID: \"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924\") " pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.195281 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fcf3c696-ca9a-4b92-9314-dcc4edb86577-memberlist\") pod \"speaker-tsmdg\" (UID: \"fcf3c696-ca9a-4b92-9314-dcc4edb86577\") " pod="metallb-system/speaker-tsmdg" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.195332 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96184c2a-b2b5-4dec-b93b-a875c0a07930-metrics-certs\") pod \"controller-68d546b9d8-xbpss\" (UID: \"96184c2a-b2b5-4dec-b93b-a875c0a07930\") " pod="metallb-system/controller-68d546b9d8-xbpss" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.195353 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w5gh\" (UniqueName: \"kubernetes.io/projected/fcf3c696-ca9a-4b92-9314-dcc4edb86577-kube-api-access-7w5gh\") pod \"speaker-tsmdg\" (UID: \"fcf3c696-ca9a-4b92-9314-dcc4edb86577\") " pod="metallb-system/speaker-tsmdg" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.195436 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcf3c696-ca9a-4b92-9314-dcc4edb86577-metrics-certs\") pod \"speaker-tsmdg\" (UID: \"fcf3c696-ca9a-4b92-9314-dcc4edb86577\") " pod="metallb-system/speaker-tsmdg" Oct 06 06:57:00 crc kubenswrapper[4845]: E1006 06:57:00.195444 4845 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 06 06:57:00 crc 
kubenswrapper[4845]: E1006 06:57:00.195522 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcf3c696-ca9a-4b92-9314-dcc4edb86577-memberlist podName:fcf3c696-ca9a-4b92-9314-dcc4edb86577 nodeName:}" failed. No retries permitted until 2025-10-06 06:57:00.695498365 +0000 UTC m=+705.210239473 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/fcf3c696-ca9a-4b92-9314-dcc4edb86577-memberlist") pod "speaker-tsmdg" (UID: "fcf3c696-ca9a-4b92-9314-dcc4edb86577") : secret "metallb-memberlist" not found Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.195457 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/fcf3c696-ca9a-4b92-9314-dcc4edb86577-metallb-excludel2\") pod \"speaker-tsmdg\" (UID: \"fcf3c696-ca9a-4b92-9314-dcc4edb86577\") " pod="metallb-system/speaker-tsmdg" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.195823 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpcwk\" (UniqueName: \"kubernetes.io/projected/96184c2a-b2b5-4dec-b93b-a875c0a07930-kube-api-access-rpcwk\") pod \"controller-68d546b9d8-xbpss\" (UID: \"96184c2a-b2b5-4dec-b93b-a875c0a07930\") " pod="metallb-system/controller-68d546b9d8-xbpss" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.195939 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/96184c2a-b2b5-4dec-b93b-a875c0a07930-cert\") pod \"controller-68d546b9d8-xbpss\" (UID: \"96184c2a-b2b5-4dec-b93b-a875c0a07930\") " pod="metallb-system/controller-68d546b9d8-xbpss" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.196075 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/fcf3c696-ca9a-4b92-9314-dcc4edb86577-metallb-excludel2\") pod \"speaker-tsmdg\" (UID: \"fcf3c696-ca9a-4b92-9314-dcc4edb86577\") " pod="metallb-system/speaker-tsmdg" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.201899 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcf3c696-ca9a-4b92-9314-dcc4edb86577-metrics-certs\") pod \"speaker-tsmdg\" (UID: \"fcf3c696-ca9a-4b92-9314-dcc4edb86577\") " pod="metallb-system/speaker-tsmdg" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.202094 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96184c2a-b2b5-4dec-b93b-a875c0a07930-metrics-certs\") pod \"controller-68d546b9d8-xbpss\" (UID: \"96184c2a-b2b5-4dec-b93b-a875c0a07930\") " pod="metallb-system/controller-68d546b9d8-xbpss" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.202572 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/96184c2a-b2b5-4dec-b93b-a875c0a07930-cert\") pod \"controller-68d546b9d8-xbpss\" (UID: \"96184c2a-b2b5-4dec-b93b-a875c0a07930\") " pod="metallb-system/controller-68d546b9d8-xbpss" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.212696 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.217410 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpcwk\" (UniqueName: \"kubernetes.io/projected/96184c2a-b2b5-4dec-b93b-a875c0a07930-kube-api-access-rpcwk\") pod \"controller-68d546b9d8-xbpss\" (UID: \"96184c2a-b2b5-4dec-b93b-a875c0a07930\") " pod="metallb-system/controller-68d546b9d8-xbpss" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.219163 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w5gh\" (UniqueName: \"kubernetes.io/projected/fcf3c696-ca9a-4b92-9314-dcc4edb86577-kube-api-access-7w5gh\") pod \"speaker-tsmdg\" (UID: \"fcf3c696-ca9a-4b92-9314-dcc4edb86577\") " pod="metallb-system/speaker-tsmdg" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.321938 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-xbpss" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.574252 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-xbpss"] Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.601828 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/396607a8-6c19-447d-a4a2-7a8ef92d957a-cert\") pod \"frr-k8s-webhook-server-64bf5d555-kctns\" (UID: \"396607a8-6c19-447d-a4a2-7a8ef92d957a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-kctns" Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.607969 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/396607a8-6c19-447d-a4a2-7a8ef92d957a-cert\") pod \"frr-k8s-webhook-server-64bf5d555-kctns\" (UID: \"396607a8-6c19-447d-a4a2-7a8ef92d957a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-kctns" Oct 06 06:57:00 crc kubenswrapper[4845]: 
I1006 06:57:00.703623 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fcf3c696-ca9a-4b92-9314-dcc4edb86577-memberlist\") pod \"speaker-tsmdg\" (UID: \"fcf3c696-ca9a-4b92-9314-dcc4edb86577\") " pod="metallb-system/speaker-tsmdg" Oct 06 06:57:00 crc kubenswrapper[4845]: E1006 06:57:00.703926 4845 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 06 06:57:00 crc kubenswrapper[4845]: E1006 06:57:00.704004 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcf3c696-ca9a-4b92-9314-dcc4edb86577-memberlist podName:fcf3c696-ca9a-4b92-9314-dcc4edb86577 nodeName:}" failed. No retries permitted until 2025-10-06 06:57:01.703987241 +0000 UTC m=+706.218728249 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/fcf3c696-ca9a-4b92-9314-dcc4edb86577-memberlist") pod "speaker-tsmdg" (UID: "fcf3c696-ca9a-4b92-9314-dcc4edb86577") : secret "metallb-memberlist" not found Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.762931 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dpj7m" event={"ID":"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924","Type":"ContainerStarted","Data":"846c28740d14a3bdeccb33a5b0dba6929a41d5f812ba9781f1af4b84f3c73583"} Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.764767 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-xbpss" event={"ID":"96184c2a-b2b5-4dec-b93b-a875c0a07930","Type":"ContainerStarted","Data":"0fc22af0e4c88fc65f360767492fb7669472ac1a6b35aca85fd215268409edd3"} Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.764788 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-xbpss" 
event={"ID":"96184c2a-b2b5-4dec-b93b-a875c0a07930","Type":"ContainerStarted","Data":"24c320f33e09428ea28c3efbf0e14b48eee120b055a441d01d9cb26153ad19b7"} Oct 06 06:57:00 crc kubenswrapper[4845]: I1006 06:57:00.828835 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-kctns" Oct 06 06:57:01 crc kubenswrapper[4845]: I1006 06:57:01.196062 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-kctns"] Oct 06 06:57:01 crc kubenswrapper[4845]: I1006 06:57:01.714908 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fcf3c696-ca9a-4b92-9314-dcc4edb86577-memberlist\") pod \"speaker-tsmdg\" (UID: \"fcf3c696-ca9a-4b92-9314-dcc4edb86577\") " pod="metallb-system/speaker-tsmdg" Oct 06 06:57:01 crc kubenswrapper[4845]: I1006 06:57:01.719550 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fcf3c696-ca9a-4b92-9314-dcc4edb86577-memberlist\") pod \"speaker-tsmdg\" (UID: \"fcf3c696-ca9a-4b92-9314-dcc4edb86577\") " pod="metallb-system/speaker-tsmdg" Oct 06 06:57:01 crc kubenswrapper[4845]: I1006 06:57:01.771096 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-kctns" event={"ID":"396607a8-6c19-447d-a4a2-7a8ef92d957a","Type":"ContainerStarted","Data":"ed646686d5d259724752fe6e9d232fb9483adeccd3da3cf7b329bc5c3ae0f40f"} Oct 06 06:57:01 crc kubenswrapper[4845]: I1006 06:57:01.773581 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-xbpss" event={"ID":"96184c2a-b2b5-4dec-b93b-a875c0a07930","Type":"ContainerStarted","Data":"83d2ad9d4723428194126a6e7495a96c58d77e7e38582d9fc4095b16a8ed95cb"} Oct 06 06:57:01 crc kubenswrapper[4845]: I1006 06:57:01.774208 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="metallb-system/controller-68d546b9d8-xbpss" Oct 06 06:57:01 crc kubenswrapper[4845]: I1006 06:57:01.791218 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-xbpss" podStartSLOduration=2.791201138 podStartE2EDuration="2.791201138s" podCreationTimestamp="2025-10-06 06:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:57:01.788233636 +0000 UTC m=+706.302974674" watchObservedRunningTime="2025-10-06 06:57:01.791201138 +0000 UTC m=+706.305942146" Oct 06 06:57:01 crc kubenswrapper[4845]: I1006 06:57:01.804424 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tsmdg" Oct 06 06:57:02 crc kubenswrapper[4845]: I1006 06:57:02.785570 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tsmdg" event={"ID":"fcf3c696-ca9a-4b92-9314-dcc4edb86577","Type":"ContainerStarted","Data":"30893acbb4d9a3fcd557ed2b7020730b04958c240e07f08b3924c0a4069ef374"} Oct 06 06:57:02 crc kubenswrapper[4845]: I1006 06:57:02.785608 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tsmdg" event={"ID":"fcf3c696-ca9a-4b92-9314-dcc4edb86577","Type":"ContainerStarted","Data":"f9838da9fd8403d1a6b815cc7915a26ebd818742d795b04b785743e537a95315"} Oct 06 06:57:02 crc kubenswrapper[4845]: I1006 06:57:02.785619 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tsmdg" event={"ID":"fcf3c696-ca9a-4b92-9314-dcc4edb86577","Type":"ContainerStarted","Data":"95b9b9b27b462766ee48acb536d75cc8c9024b07ff9e7ed4d076dfbb88875e4c"} Oct 06 06:57:02 crc kubenswrapper[4845]: I1006 06:57:02.786113 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-tsmdg" Oct 06 06:57:02 crc kubenswrapper[4845]: I1006 06:57:02.817323 4845 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="metallb-system/speaker-tsmdg" podStartSLOduration=3.817302396 podStartE2EDuration="3.817302396s" podCreationTimestamp="2025-10-06 06:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:57:02.813844932 +0000 UTC m=+707.328585960" watchObservedRunningTime="2025-10-06 06:57:02.817302396 +0000 UTC m=+707.332043404" Oct 06 06:57:07 crc kubenswrapper[4845]: I1006 06:57:07.818911 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-kctns" event={"ID":"396607a8-6c19-447d-a4a2-7a8ef92d957a","Type":"ContainerStarted","Data":"c90e30f7a5701d1c1f7f45fa3c3dc6a0fdeae7808d871b59e96e6eb68034cbc9"} Oct 06 06:57:07 crc kubenswrapper[4845]: I1006 06:57:07.819386 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-kctns" Oct 06 06:57:07 crc kubenswrapper[4845]: I1006 06:57:07.821980 4845 generic.go:334] "Generic (PLEG): container finished" podID="ff6e65ed-e4d3-4fce-b1c9-87eb219c2924" containerID="f964234d0e1d766f625b06bd591e1a9759e5d2b987be6195f3f81555fd798fc3" exitCode=0 Oct 06 06:57:07 crc kubenswrapper[4845]: I1006 06:57:07.822038 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dpj7m" event={"ID":"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924","Type":"ContainerDied","Data":"f964234d0e1d766f625b06bd591e1a9759e5d2b987be6195f3f81555fd798fc3"} Oct 06 06:57:07 crc kubenswrapper[4845]: I1006 06:57:07.842355 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-kctns" podStartSLOduration=2.6388740630000003 podStartE2EDuration="8.84233505s" podCreationTimestamp="2025-10-06 06:56:59 +0000 UTC" firstStartedPulling="2025-10-06 06:57:01.210183223 +0000 UTC m=+705.724924251" lastFinishedPulling="2025-10-06 06:57:07.41364421 +0000 UTC 
m=+711.928385238" observedRunningTime="2025-10-06 06:57:07.836084067 +0000 UTC m=+712.350825085" watchObservedRunningTime="2025-10-06 06:57:07.84233505 +0000 UTC m=+712.357076068" Oct 06 06:57:08 crc kubenswrapper[4845]: I1006 06:57:08.829768 4845 generic.go:334] "Generic (PLEG): container finished" podID="ff6e65ed-e4d3-4fce-b1c9-87eb219c2924" containerID="aa0ea41409dc11672b5e515dd954f229140cc95233d0f349715365367936a784" exitCode=0 Oct 06 06:57:08 crc kubenswrapper[4845]: I1006 06:57:08.830591 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dpj7m" event={"ID":"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924","Type":"ContainerDied","Data":"aa0ea41409dc11672b5e515dd954f229140cc95233d0f349715365367936a784"} Oct 06 06:57:09 crc kubenswrapper[4845]: I1006 06:57:09.840006 4845 generic.go:334] "Generic (PLEG): container finished" podID="ff6e65ed-e4d3-4fce-b1c9-87eb219c2924" containerID="65e477615d8089a98769ac668147778997f0466bde65846775c00988e7d164f5" exitCode=0 Oct 06 06:57:09 crc kubenswrapper[4845]: I1006 06:57:09.840062 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dpj7m" event={"ID":"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924","Type":"ContainerDied","Data":"65e477615d8089a98769ac668147778997f0466bde65846775c00988e7d164f5"} Oct 06 06:57:10 crc kubenswrapper[4845]: I1006 06:57:10.326553 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-xbpss" Oct 06 06:57:10 crc kubenswrapper[4845]: I1006 06:57:10.852596 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dpj7m" event={"ID":"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924","Type":"ContainerStarted","Data":"7ff37799c4438fd6276060e925fc01518b43657795225a25929bdf85b3672f79"} Oct 06 06:57:10 crc kubenswrapper[4845]: I1006 06:57:10.852645 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dpj7m" 
event={"ID":"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924","Type":"ContainerStarted","Data":"19a7c05c17cc08ff21b0cde515c8f8cb73467e3c88d857631e8e6fb468e57c8c"} Oct 06 06:57:10 crc kubenswrapper[4845]: I1006 06:57:10.852660 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dpj7m" event={"ID":"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924","Type":"ContainerStarted","Data":"d05c1a98d8c52fcab44e9483a6b698075e6a1807a61ce35909fbd7183b06daa9"} Oct 06 06:57:10 crc kubenswrapper[4845]: I1006 06:57:10.852672 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dpj7m" event={"ID":"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924","Type":"ContainerStarted","Data":"b07cb350f5dac1d4b8d03fd0c77b4058537a6ff9d274623fed9776feb6996288"} Oct 06 06:57:10 crc kubenswrapper[4845]: I1006 06:57:10.852684 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dpj7m" event={"ID":"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924","Type":"ContainerStarted","Data":"c294caadde9561c5a074b879019e17859e54d8650fb763aa1279156733d4ed93"} Oct 06 06:57:11 crc kubenswrapper[4845]: I1006 06:57:11.861904 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dpj7m" event={"ID":"ff6e65ed-e4d3-4fce-b1c9-87eb219c2924","Type":"ContainerStarted","Data":"944b4d5642b226261e608c86b11fd74d3632e14cf2d0a53da6512aedae50367f"} Oct 06 06:57:11 crc kubenswrapper[4845]: I1006 06:57:11.862309 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:57:11 crc kubenswrapper[4845]: I1006 06:57:11.893673 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-dpj7m" podStartSLOduration=5.805457922 podStartE2EDuration="12.893648921s" podCreationTimestamp="2025-10-06 06:56:59 +0000 UTC" firstStartedPulling="2025-10-06 06:57:00.342791806 +0000 UTC m=+704.857532824" lastFinishedPulling="2025-10-06 06:57:07.430982815 +0000 UTC m=+711.945723823" 
observedRunningTime="2025-10-06 06:57:11.890007289 +0000 UTC m=+716.404748317" watchObservedRunningTime="2025-10-06 06:57:11.893648921 +0000 UTC m=+716.408389929" Oct 06 06:57:15 crc kubenswrapper[4845]: I1006 06:57:15.213727 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:57:15 crc kubenswrapper[4845]: I1006 06:57:15.264318 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:57:20 crc kubenswrapper[4845]: I1006 06:57:20.217034 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-dpj7m" Oct 06 06:57:20 crc kubenswrapper[4845]: I1006 06:57:20.833806 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-kctns" Oct 06 06:57:21 crc kubenswrapper[4845]: I1006 06:57:21.810338 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-tsmdg" Oct 06 06:57:23 crc kubenswrapper[4845]: I1006 06:57:23.019261 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 06:57:23 crc kubenswrapper[4845]: I1006 06:57:23.020726 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 06:57:24 crc kubenswrapper[4845]: I1006 06:57:24.717646 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-kk2sz"] Oct 06 06:57:24 crc 
kubenswrapper[4845]: I1006 06:57:24.719530 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kk2sz" Oct 06 06:57:24 crc kubenswrapper[4845]: I1006 06:57:24.722084 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 06 06:57:24 crc kubenswrapper[4845]: I1006 06:57:24.722106 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-6fs2f" Oct 06 06:57:24 crc kubenswrapper[4845]: I1006 06:57:24.722231 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 06 06:57:24 crc kubenswrapper[4845]: I1006 06:57:24.737206 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kk2sz"] Oct 06 06:57:24 crc kubenswrapper[4845]: I1006 06:57:24.851701 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmdvq\" (UniqueName: \"kubernetes.io/projected/eb18cb7b-f08f-4d76-9aaa-150bfac996d8-kube-api-access-mmdvq\") pod \"openstack-operator-index-kk2sz\" (UID: \"eb18cb7b-f08f-4d76-9aaa-150bfac996d8\") " pod="openstack-operators/openstack-operator-index-kk2sz" Oct 06 06:57:24 crc kubenswrapper[4845]: I1006 06:57:24.953526 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmdvq\" (UniqueName: \"kubernetes.io/projected/eb18cb7b-f08f-4d76-9aaa-150bfac996d8-kube-api-access-mmdvq\") pod \"openstack-operator-index-kk2sz\" (UID: \"eb18cb7b-f08f-4d76-9aaa-150bfac996d8\") " pod="openstack-operators/openstack-operator-index-kk2sz" Oct 06 06:57:24 crc kubenswrapper[4845]: I1006 06:57:24.976392 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmdvq\" (UniqueName: 
\"kubernetes.io/projected/eb18cb7b-f08f-4d76-9aaa-150bfac996d8-kube-api-access-mmdvq\") pod \"openstack-operator-index-kk2sz\" (UID: \"eb18cb7b-f08f-4d76-9aaa-150bfac996d8\") " pod="openstack-operators/openstack-operator-index-kk2sz" Oct 06 06:57:25 crc kubenswrapper[4845]: I1006 06:57:25.045435 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kk2sz" Oct 06 06:57:25 crc kubenswrapper[4845]: I1006 06:57:25.435974 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kk2sz"] Oct 06 06:57:25 crc kubenswrapper[4845]: I1006 06:57:25.965724 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kk2sz" event={"ID":"eb18cb7b-f08f-4d76-9aaa-150bfac996d8","Type":"ContainerStarted","Data":"cdc4663a31c50645dc80eec0a848cc2425fc6f90cb15e5b6095886b82fae09b9"} Oct 06 06:57:26 crc kubenswrapper[4845]: I1006 06:57:26.974575 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kk2sz" event={"ID":"eb18cb7b-f08f-4d76-9aaa-150bfac996d8","Type":"ContainerStarted","Data":"4341594aeadd12cc3e41b1993442badb114687473e271df5c05bff59c8b56d63"} Oct 06 06:57:26 crc kubenswrapper[4845]: I1006 06:57:26.994251 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-kk2sz" podStartSLOduration=2.260261592 podStartE2EDuration="2.994233699s" podCreationTimestamp="2025-10-06 06:57:24 +0000 UTC" firstStartedPulling="2025-10-06 06:57:25.442215599 +0000 UTC m=+729.956956597" lastFinishedPulling="2025-10-06 06:57:26.176187696 +0000 UTC m=+730.690928704" observedRunningTime="2025-10-06 06:57:26.990879694 +0000 UTC m=+731.505620752" watchObservedRunningTime="2025-10-06 06:57:26.994233699 +0000 UTC m=+731.508974707" Oct 06 06:57:28 crc kubenswrapper[4845]: I1006 06:57:28.104213 4845 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack-operators/openstack-operator-index-kk2sz"] Oct 06 06:57:28 crc kubenswrapper[4845]: I1006 06:57:28.702480 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-k8dzl"] Oct 06 06:57:28 crc kubenswrapper[4845]: I1006 06:57:28.703134 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-k8dzl" Oct 06 06:57:28 crc kubenswrapper[4845]: I1006 06:57:28.710974 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-k8dzl"] Oct 06 06:57:28 crc kubenswrapper[4845]: I1006 06:57:28.808890 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld475\" (UniqueName: \"kubernetes.io/projected/6cbeaf69-7c1e-472a-8383-209ac778658e-kube-api-access-ld475\") pod \"openstack-operator-index-k8dzl\" (UID: \"6cbeaf69-7c1e-472a-8383-209ac778658e\") " pod="openstack-operators/openstack-operator-index-k8dzl" Oct 06 06:57:28 crc kubenswrapper[4845]: I1006 06:57:28.910395 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld475\" (UniqueName: \"kubernetes.io/projected/6cbeaf69-7c1e-472a-8383-209ac778658e-kube-api-access-ld475\") pod \"openstack-operator-index-k8dzl\" (UID: \"6cbeaf69-7c1e-472a-8383-209ac778658e\") " pod="openstack-operators/openstack-operator-index-k8dzl" Oct 06 06:57:28 crc kubenswrapper[4845]: I1006 06:57:28.937875 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld475\" (UniqueName: \"kubernetes.io/projected/6cbeaf69-7c1e-472a-8383-209ac778658e-kube-api-access-ld475\") pod \"openstack-operator-index-k8dzl\" (UID: \"6cbeaf69-7c1e-472a-8383-209ac778658e\") " pod="openstack-operators/openstack-operator-index-k8dzl" Oct 06 06:57:28 crc kubenswrapper[4845]: I1006 06:57:28.985240 4845 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack-operators/openstack-operator-index-kk2sz" podUID="eb18cb7b-f08f-4d76-9aaa-150bfac996d8" containerName="registry-server" containerID="cri-o://4341594aeadd12cc3e41b1993442badb114687473e271df5c05bff59c8b56d63" gracePeriod=2 Oct 06 06:57:29 crc kubenswrapper[4845]: I1006 06:57:29.019214 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-k8dzl" Oct 06 06:57:29 crc kubenswrapper[4845]: I1006 06:57:29.219999 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-k8dzl"] Oct 06 06:57:29 crc kubenswrapper[4845]: I1006 06:57:29.343088 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kk2sz" Oct 06 06:57:29 crc kubenswrapper[4845]: I1006 06:57:29.419352 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmdvq\" (UniqueName: \"kubernetes.io/projected/eb18cb7b-f08f-4d76-9aaa-150bfac996d8-kube-api-access-mmdvq\") pod \"eb18cb7b-f08f-4d76-9aaa-150bfac996d8\" (UID: \"eb18cb7b-f08f-4d76-9aaa-150bfac996d8\") " Oct 06 06:57:29 crc kubenswrapper[4845]: I1006 06:57:29.426312 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb18cb7b-f08f-4d76-9aaa-150bfac996d8-kube-api-access-mmdvq" (OuterVolumeSpecName: "kube-api-access-mmdvq") pod "eb18cb7b-f08f-4d76-9aaa-150bfac996d8" (UID: "eb18cb7b-f08f-4d76-9aaa-150bfac996d8"). InnerVolumeSpecName "kube-api-access-mmdvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:57:29 crc kubenswrapper[4845]: I1006 06:57:29.521091 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmdvq\" (UniqueName: \"kubernetes.io/projected/eb18cb7b-f08f-4d76-9aaa-150bfac996d8-kube-api-access-mmdvq\") on node \"crc\" DevicePath \"\"" Oct 06 06:57:29 crc kubenswrapper[4845]: I1006 06:57:29.992319 4845 generic.go:334] "Generic (PLEG): container finished" podID="eb18cb7b-f08f-4d76-9aaa-150bfac996d8" containerID="4341594aeadd12cc3e41b1993442badb114687473e271df5c05bff59c8b56d63" exitCode=0 Oct 06 06:57:29 crc kubenswrapper[4845]: I1006 06:57:29.992418 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kk2sz" event={"ID":"eb18cb7b-f08f-4d76-9aaa-150bfac996d8","Type":"ContainerDied","Data":"4341594aeadd12cc3e41b1993442badb114687473e271df5c05bff59c8b56d63"} Oct 06 06:57:29 crc kubenswrapper[4845]: I1006 06:57:29.992725 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kk2sz" event={"ID":"eb18cb7b-f08f-4d76-9aaa-150bfac996d8","Type":"ContainerDied","Data":"cdc4663a31c50645dc80eec0a848cc2425fc6f90cb15e5b6095886b82fae09b9"} Oct 06 06:57:29 crc kubenswrapper[4845]: I1006 06:57:29.992743 4845 scope.go:117] "RemoveContainer" containerID="4341594aeadd12cc3e41b1993442badb114687473e271df5c05bff59c8b56d63" Oct 06 06:57:29 crc kubenswrapper[4845]: I1006 06:57:29.992458 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-kk2sz" Oct 06 06:57:29 crc kubenswrapper[4845]: I1006 06:57:29.994338 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-k8dzl" event={"ID":"6cbeaf69-7c1e-472a-8383-209ac778658e","Type":"ContainerStarted","Data":"6c827d0d06cb6241f67e1d2bdb85198dbc89bd90071049e6fb6583d8941e0a68"} Oct 06 06:57:29 crc kubenswrapper[4845]: I1006 06:57:29.994438 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-k8dzl" event={"ID":"6cbeaf69-7c1e-472a-8383-209ac778658e","Type":"ContainerStarted","Data":"9e975d5b50abf511c053f16169d37ff49e2a11cf5e851c98f1c3e024b6244788"} Oct 06 06:57:30 crc kubenswrapper[4845]: I1006 06:57:30.009866 4845 scope.go:117] "RemoveContainer" containerID="4341594aeadd12cc3e41b1993442badb114687473e271df5c05bff59c8b56d63" Oct 06 06:57:30 crc kubenswrapper[4845]: E1006 06:57:30.010196 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4341594aeadd12cc3e41b1993442badb114687473e271df5c05bff59c8b56d63\": container with ID starting with 4341594aeadd12cc3e41b1993442badb114687473e271df5c05bff59c8b56d63 not found: ID does not exist" containerID="4341594aeadd12cc3e41b1993442badb114687473e271df5c05bff59c8b56d63" Oct 06 06:57:30 crc kubenswrapper[4845]: I1006 06:57:30.010230 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4341594aeadd12cc3e41b1993442badb114687473e271df5c05bff59c8b56d63"} err="failed to get container status \"4341594aeadd12cc3e41b1993442badb114687473e271df5c05bff59c8b56d63\": rpc error: code = NotFound desc = could not find container \"4341594aeadd12cc3e41b1993442badb114687473e271df5c05bff59c8b56d63\": container with ID starting with 4341594aeadd12cc3e41b1993442badb114687473e271df5c05bff59c8b56d63 not found: ID does not exist" Oct 06 06:57:30 crc 
kubenswrapper[4845]: I1006 06:57:30.014755 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-k8dzl" podStartSLOduration=1.625456313 podStartE2EDuration="2.014745386s" podCreationTimestamp="2025-10-06 06:57:28 +0000 UTC" firstStartedPulling="2025-10-06 06:57:29.24483864 +0000 UTC m=+733.759579648" lastFinishedPulling="2025-10-06 06:57:29.634127683 +0000 UTC m=+734.148868721" observedRunningTime="2025-10-06 06:57:30.012965831 +0000 UTC m=+734.527706849" watchObservedRunningTime="2025-10-06 06:57:30.014745386 +0000 UTC m=+734.529486394" Oct 06 06:57:30 crc kubenswrapper[4845]: I1006 06:57:30.028695 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-kk2sz"] Oct 06 06:57:30 crc kubenswrapper[4845]: I1006 06:57:30.031437 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-kk2sz"] Oct 06 06:57:30 crc kubenswrapper[4845]: I1006 06:57:30.235407 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb18cb7b-f08f-4d76-9aaa-150bfac996d8" path="/var/lib/kubelet/pods/eb18cb7b-f08f-4d76-9aaa-150bfac996d8/volumes" Oct 06 06:57:39 crc kubenswrapper[4845]: I1006 06:57:39.021001 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-k8dzl" Oct 06 06:57:39 crc kubenswrapper[4845]: I1006 06:57:39.021661 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-k8dzl" Oct 06 06:57:39 crc kubenswrapper[4845]: I1006 06:57:39.061187 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-k8dzl" Oct 06 06:57:39 crc kubenswrapper[4845]: I1006 06:57:39.091822 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-k8dzl" Oct 06 
06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.437053 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-w8s9q"] Oct 06 06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.437561 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" podUID="25e6cec0-1b44-4424-b931-81b30b582922" containerName="controller-manager" containerID="cri-o://c27db7c8ea53c1be7bba6c9010ff11ac7b78de0d190861cd810206dcd771e885" gracePeriod=30 Oct 06 06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.524515 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq"] Oct 06 06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.524734 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" podUID="b245e98d-5e97-4ab4-b35c-044899fab150" containerName="route-controller-manager" containerID="cri-o://21bf98ab32c0bea0c0e265ed75812811062db9c4306106380fa30420bc47f0ba" gracePeriod=30 Oct 06 06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.542858 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl"] Oct 06 06:57:41 crc kubenswrapper[4845]: E1006 06:57:41.543105 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb18cb7b-f08f-4d76-9aaa-150bfac996d8" containerName="registry-server" Oct 06 06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.543117 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb18cb7b-f08f-4d76-9aaa-150bfac996d8" containerName="registry-server" Oct 06 06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.543216 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb18cb7b-f08f-4d76-9aaa-150bfac996d8" containerName="registry-server" Oct 06 
06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.543995 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl" Oct 06 06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.553544 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-7flbn" Oct 06 06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.568440 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl"] Oct 06 06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.691885 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqh8r\" (UniqueName: \"kubernetes.io/projected/609d88e3-66d9-4f44-a539-2b6c35886f06-kube-api-access-zqh8r\") pod \"0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl\" (UID: \"609d88e3-66d9-4f44-a539-2b6c35886f06\") " pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl" Oct 06 06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.691925 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/609d88e3-66d9-4f44-a539-2b6c35886f06-util\") pod \"0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl\" (UID: \"609d88e3-66d9-4f44-a539-2b6c35886f06\") " pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl" Oct 06 06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.692085 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/609d88e3-66d9-4f44-a539-2b6c35886f06-bundle\") pod \"0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl\" (UID: \"609d88e3-66d9-4f44-a539-2b6c35886f06\") " 
pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl" Oct 06 06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.793461 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/609d88e3-66d9-4f44-a539-2b6c35886f06-bundle\") pod \"0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl\" (UID: \"609d88e3-66d9-4f44-a539-2b6c35886f06\") " pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl" Oct 06 06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.793923 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/609d88e3-66d9-4f44-a539-2b6c35886f06-bundle\") pod \"0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl\" (UID: \"609d88e3-66d9-4f44-a539-2b6c35886f06\") " pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl" Oct 06 06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.794074 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqh8r\" (UniqueName: \"kubernetes.io/projected/609d88e3-66d9-4f44-a539-2b6c35886f06-kube-api-access-zqh8r\") pod \"0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl\" (UID: \"609d88e3-66d9-4f44-a539-2b6c35886f06\") " pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl" Oct 06 06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.794126 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/609d88e3-66d9-4f44-a539-2b6c35886f06-util\") pod \"0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl\" (UID: \"609d88e3-66d9-4f44-a539-2b6c35886f06\") " pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl" Oct 06 06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.794388 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/609d88e3-66d9-4f44-a539-2b6c35886f06-util\") pod \"0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl\" (UID: \"609d88e3-66d9-4f44-a539-2b6c35886f06\") " pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl" Oct 06 06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.813195 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqh8r\" (UniqueName: \"kubernetes.io/projected/609d88e3-66d9-4f44-a539-2b6c35886f06-kube-api-access-zqh8r\") pod \"0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl\" (UID: \"609d88e3-66d9-4f44-a539-2b6c35886f06\") " pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl" Oct 06 06:57:41 crc kubenswrapper[4845]: I1006 06:57:41.991899 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl" Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.065138 4845 generic.go:334] "Generic (PLEG): container finished" podID="25e6cec0-1b44-4424-b931-81b30b582922" containerID="c27db7c8ea53c1be7bba6c9010ff11ac7b78de0d190861cd810206dcd771e885" exitCode=0 Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.065216 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" event={"ID":"25e6cec0-1b44-4424-b931-81b30b582922","Type":"ContainerDied","Data":"c27db7c8ea53c1be7bba6c9010ff11ac7b78de0d190861cd810206dcd771e885"} Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.066742 4845 generic.go:334] "Generic (PLEG): container finished" podID="b245e98d-5e97-4ab4-b35c-044899fab150" containerID="21bf98ab32c0bea0c0e265ed75812811062db9c4306106380fa30420bc47f0ba" exitCode=0 Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.066780 4845 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" event={"ID":"b245e98d-5e97-4ab4-b35c-044899fab150","Type":"ContainerDied","Data":"21bf98ab32c0bea0c0e265ed75812811062db9c4306106380fa30420bc47f0ba"} Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.363506 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.392340 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.429079 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl"] Oct 06 06:57:42 crc kubenswrapper[4845]: W1006 06:57:42.437912 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod609d88e3_66d9_4f44_a539_2b6c35886f06.slice/crio-de4e6937dfaf713355858142a8431d8e7222e86770499dea4ff0b1ee1f3e6721 WatchSource:0}: Error finding container de4e6937dfaf713355858142a8431d8e7222e86770499dea4ff0b1ee1f3e6721: Status 404 returned error can't find the container with id de4e6937dfaf713355858142a8431d8e7222e86770499dea4ff0b1ee1f3e6721 Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.501995 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e6cec0-1b44-4424-b931-81b30b582922-config\") pod \"25e6cec0-1b44-4424-b931-81b30b582922\" (UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") " Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.502070 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/25e6cec0-1b44-4424-b931-81b30b582922-proxy-ca-bundles\") pod \"25e6cec0-1b44-4424-b931-81b30b582922\" (UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") " Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.502106 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b245e98d-5e97-4ab4-b35c-044899fab150-client-ca\") pod \"b245e98d-5e97-4ab4-b35c-044899fab150\" (UID: \"b245e98d-5e97-4ab4-b35c-044899fab150\") " Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.502136 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25e6cec0-1b44-4424-b931-81b30b582922-serving-cert\") pod \"25e6cec0-1b44-4424-b931-81b30b582922\" (UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") " Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.502175 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxdlr\" (UniqueName: \"kubernetes.io/projected/b245e98d-5e97-4ab4-b35c-044899fab150-kube-api-access-fxdlr\") pod \"b245e98d-5e97-4ab4-b35c-044899fab150\" (UID: \"b245e98d-5e97-4ab4-b35c-044899fab150\") " Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.502207 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24vh9\" (UniqueName: \"kubernetes.io/projected/25e6cec0-1b44-4424-b931-81b30b582922-kube-api-access-24vh9\") pod \"25e6cec0-1b44-4424-b931-81b30b582922\" (UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") " Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.502233 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b245e98d-5e97-4ab4-b35c-044899fab150-config\") pod \"b245e98d-5e97-4ab4-b35c-044899fab150\" (UID: \"b245e98d-5e97-4ab4-b35c-044899fab150\") " Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 
06:57:42.502882 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b245e98d-5e97-4ab4-b35c-044899fab150-client-ca" (OuterVolumeSpecName: "client-ca") pod "b245e98d-5e97-4ab4-b35c-044899fab150" (UID: "b245e98d-5e97-4ab4-b35c-044899fab150"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.502909 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e6cec0-1b44-4424-b931-81b30b582922-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "25e6cec0-1b44-4424-b931-81b30b582922" (UID: "25e6cec0-1b44-4424-b931-81b30b582922"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.502918 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b245e98d-5e97-4ab4-b35c-044899fab150-config" (OuterVolumeSpecName: "config") pod "b245e98d-5e97-4ab4-b35c-044899fab150" (UID: "b245e98d-5e97-4ab4-b35c-044899fab150"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.502921 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e6cec0-1b44-4424-b931-81b30b582922-client-ca" (OuterVolumeSpecName: "client-ca") pod "25e6cec0-1b44-4424-b931-81b30b582922" (UID: "25e6cec0-1b44-4424-b931-81b30b582922"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.502274 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25e6cec0-1b44-4424-b931-81b30b582922-client-ca\") pod \"25e6cec0-1b44-4424-b931-81b30b582922\" (UID: \"25e6cec0-1b44-4424-b931-81b30b582922\") "
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.502964 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e6cec0-1b44-4424-b931-81b30b582922-config" (OuterVolumeSpecName: "config") pod "25e6cec0-1b44-4424-b931-81b30b582922" (UID: "25e6cec0-1b44-4424-b931-81b30b582922"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.503003 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b245e98d-5e97-4ab4-b35c-044899fab150-serving-cert\") pod \"b245e98d-5e97-4ab4-b35c-044899fab150\" (UID: \"b245e98d-5e97-4ab4-b35c-044899fab150\") "
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.503236 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b245e98d-5e97-4ab4-b35c-044899fab150-config\") on node \"crc\" DevicePath \"\""
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.503254 4845 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25e6cec0-1b44-4424-b931-81b30b582922-client-ca\") on node \"crc\" DevicePath \"\""
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.503264 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e6cec0-1b44-4424-b931-81b30b582922-config\") on node \"crc\" DevicePath \"\""
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.503273 4845 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25e6cec0-1b44-4424-b931-81b30b582922-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.503282 4845 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b245e98d-5e97-4ab4-b35c-044899fab150-client-ca\") on node \"crc\" DevicePath \"\""
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.507028 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b245e98d-5e97-4ab4-b35c-044899fab150-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b245e98d-5e97-4ab4-b35c-044899fab150" (UID: "b245e98d-5e97-4ab4-b35c-044899fab150"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.507061 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e6cec0-1b44-4424-b931-81b30b582922-kube-api-access-24vh9" (OuterVolumeSpecName: "kube-api-access-24vh9") pod "25e6cec0-1b44-4424-b931-81b30b582922" (UID: "25e6cec0-1b44-4424-b931-81b30b582922"). InnerVolumeSpecName "kube-api-access-24vh9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.507200 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e6cec0-1b44-4424-b931-81b30b582922-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "25e6cec0-1b44-4424-b931-81b30b582922" (UID: "25e6cec0-1b44-4424-b931-81b30b582922"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.508312 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b245e98d-5e97-4ab4-b35c-044899fab150-kube-api-access-fxdlr" (OuterVolumeSpecName: "kube-api-access-fxdlr") pod "b245e98d-5e97-4ab4-b35c-044899fab150" (UID: "b245e98d-5e97-4ab4-b35c-044899fab150"). InnerVolumeSpecName "kube-api-access-fxdlr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.605362 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25e6cec0-1b44-4424-b931-81b30b582922-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.605463 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxdlr\" (UniqueName: \"kubernetes.io/projected/b245e98d-5e97-4ab4-b35c-044899fab150-kube-api-access-fxdlr\") on node \"crc\" DevicePath \"\""
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.605478 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24vh9\" (UniqueName: \"kubernetes.io/projected/25e6cec0-1b44-4424-b931-81b30b582922-kube-api-access-24vh9\") on node \"crc\" DevicePath \"\""
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.605490 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b245e98d-5e97-4ab4-b35c-044899fab150-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.812387 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx"]
Oct 06 06:57:42 crc kubenswrapper[4845]: E1006 06:57:42.812601 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b245e98d-5e97-4ab4-b35c-044899fab150" containerName="route-controller-manager"
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.812612 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="b245e98d-5e97-4ab4-b35c-044899fab150" containerName="route-controller-manager"
Oct 06 06:57:42 crc kubenswrapper[4845]: E1006 06:57:42.812630 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e6cec0-1b44-4424-b931-81b30b582922" containerName="controller-manager"
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.812636 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e6cec0-1b44-4424-b931-81b30b582922" containerName="controller-manager"
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.812731 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="25e6cec0-1b44-4424-b931-81b30b582922" containerName="controller-manager"
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.812741 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="b245e98d-5e97-4ab4-b35c-044899fab150" containerName="route-controller-manager"
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.813080 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx"
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.826698 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx"]
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.909602 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d70c8b87-6c40-4ee0-ad1c-68feecb12f80-client-ca\") pod \"route-controller-manager-79cbd5d-nkdqx\" (UID: \"d70c8b87-6c40-4ee0-ad1c-68feecb12f80\") " pod="openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx"
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.909689 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljc4l\" (UniqueName: \"kubernetes.io/projected/d70c8b87-6c40-4ee0-ad1c-68feecb12f80-kube-api-access-ljc4l\") pod \"route-controller-manager-79cbd5d-nkdqx\" (UID: \"d70c8b87-6c40-4ee0-ad1c-68feecb12f80\") " pod="openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx"
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.909741 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d70c8b87-6c40-4ee0-ad1c-68feecb12f80-config\") pod \"route-controller-manager-79cbd5d-nkdqx\" (UID: \"d70c8b87-6c40-4ee0-ad1c-68feecb12f80\") " pod="openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx"
Oct 06 06:57:42 crc kubenswrapper[4845]: I1006 06:57:42.909763 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d70c8b87-6c40-4ee0-ad1c-68feecb12f80-serving-cert\") pod \"route-controller-manager-79cbd5d-nkdqx\" (UID: \"d70c8b87-6c40-4ee0-ad1c-68feecb12f80\") " pod="openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.011354 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljc4l\" (UniqueName: \"kubernetes.io/projected/d70c8b87-6c40-4ee0-ad1c-68feecb12f80-kube-api-access-ljc4l\") pod \"route-controller-manager-79cbd5d-nkdqx\" (UID: \"d70c8b87-6c40-4ee0-ad1c-68feecb12f80\") " pod="openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.011440 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d70c8b87-6c40-4ee0-ad1c-68feecb12f80-config\") pod \"route-controller-manager-79cbd5d-nkdqx\" (UID: \"d70c8b87-6c40-4ee0-ad1c-68feecb12f80\") " pod="openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.011468 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d70c8b87-6c40-4ee0-ad1c-68feecb12f80-serving-cert\") pod \"route-controller-manager-79cbd5d-nkdqx\" (UID: \"d70c8b87-6c40-4ee0-ad1c-68feecb12f80\") " pod="openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.011492 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d70c8b87-6c40-4ee0-ad1c-68feecb12f80-client-ca\") pod \"route-controller-manager-79cbd5d-nkdqx\" (UID: \"d70c8b87-6c40-4ee0-ad1c-68feecb12f80\") " pod="openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.012505 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d70c8b87-6c40-4ee0-ad1c-68feecb12f80-client-ca\") pod \"route-controller-manager-79cbd5d-nkdqx\" (UID: \"d70c8b87-6c40-4ee0-ad1c-68feecb12f80\") " pod="openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.012934 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d70c8b87-6c40-4ee0-ad1c-68feecb12f80-config\") pod \"route-controller-manager-79cbd5d-nkdqx\" (UID: \"d70c8b87-6c40-4ee0-ad1c-68feecb12f80\") " pod="openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.025624 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d70c8b87-6c40-4ee0-ad1c-68feecb12f80-serving-cert\") pod \"route-controller-manager-79cbd5d-nkdqx\" (UID: \"d70c8b87-6c40-4ee0-ad1c-68feecb12f80\") " pod="openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.026287 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljc4l\" (UniqueName: \"kubernetes.io/projected/d70c8b87-6c40-4ee0-ad1c-68feecb12f80-kube-api-access-ljc4l\") pod \"route-controller-manager-79cbd5d-nkdqx\" (UID: \"d70c8b87-6c40-4ee0-ad1c-68feecb12f80\") " pod="openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.073766 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q" event={"ID":"25e6cec0-1b44-4424-b931-81b30b582922","Type":"ContainerDied","Data":"073f197a8d337ba774cb6ffcd3448a43309c34b94068fe6b8912bedcf989fb9f"}
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.073814 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-w8s9q"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.073833 4845 scope.go:117] "RemoveContainer" containerID="c27db7c8ea53c1be7bba6c9010ff11ac7b78de0d190861cd810206dcd771e885"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.078386 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq" event={"ID":"b245e98d-5e97-4ab4-b35c-044899fab150","Type":"ContainerDied","Data":"26ffde206572845ebb94da954fbeaf9daf7ba310aaeb63c9fb8e958fcb6db186"}
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.078489 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.083723 4845 generic.go:334] "Generic (PLEG): container finished" podID="609d88e3-66d9-4f44-a539-2b6c35886f06" containerID="88b151cfba497dd151bacd2310c55057ac92bd6b704cbd22802105ac5669a913" exitCode=0
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.083795 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl" event={"ID":"609d88e3-66d9-4f44-a539-2b6c35886f06","Type":"ContainerDied","Data":"88b151cfba497dd151bacd2310c55057ac92bd6b704cbd22802105ac5669a913"}
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.083820 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl" event={"ID":"609d88e3-66d9-4f44-a539-2b6c35886f06","Type":"ContainerStarted","Data":"de4e6937dfaf713355858142a8431d8e7222e86770499dea4ff0b1ee1f3e6721"}
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.099647 4845 scope.go:117] "RemoveContainer" containerID="21bf98ab32c0bea0c0e265ed75812811062db9c4306106380fa30420bc47f0ba"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.115996 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-w8s9q"]
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.124729 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-w8s9q"]
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.130245 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.132620 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq"]
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.141500 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wh5lq"]
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.363388 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx"]
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.415322 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"]
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.416700 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.422242 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.427441 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"]
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.427721 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.427929 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.428661 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.429187 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.429392 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.432606 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.521811 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39b937ee-4cf8-447f-8a1b-d7c080bc504d-config\") pod \"controller-manager-7f7dc56458-4rqmp\" (UID: \"39b937ee-4cf8-447f-8a1b-d7c080bc504d\") " pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.521989 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/39b937ee-4cf8-447f-8a1b-d7c080bc504d-client-ca\") pod \"controller-manager-7f7dc56458-4rqmp\" (UID: \"39b937ee-4cf8-447f-8a1b-d7c080bc504d\") " pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.522037 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqj9r\" (UniqueName: \"kubernetes.io/projected/39b937ee-4cf8-447f-8a1b-d7c080bc504d-kube-api-access-bqj9r\") pod \"controller-manager-7f7dc56458-4rqmp\" (UID: \"39b937ee-4cf8-447f-8a1b-d7c080bc504d\") " pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.522208 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39b937ee-4cf8-447f-8a1b-d7c080bc504d-serving-cert\") pod \"controller-manager-7f7dc56458-4rqmp\" (UID: \"39b937ee-4cf8-447f-8a1b-d7c080bc504d\") " pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.522262 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/39b937ee-4cf8-447f-8a1b-d7c080bc504d-proxy-ca-bundles\") pod \"controller-manager-7f7dc56458-4rqmp\" (UID: \"39b937ee-4cf8-447f-8a1b-d7c080bc504d\") " pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.623687 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqj9r\" (UniqueName: \"kubernetes.io/projected/39b937ee-4cf8-447f-8a1b-d7c080bc504d-kube-api-access-bqj9r\") pod \"controller-manager-7f7dc56458-4rqmp\" (UID: \"39b937ee-4cf8-447f-8a1b-d7c080bc504d\") " pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.623804 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39b937ee-4cf8-447f-8a1b-d7c080bc504d-serving-cert\") pod \"controller-manager-7f7dc56458-4rqmp\" (UID: \"39b937ee-4cf8-447f-8a1b-d7c080bc504d\") " pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.623833 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/39b937ee-4cf8-447f-8a1b-d7c080bc504d-proxy-ca-bundles\") pod \"controller-manager-7f7dc56458-4rqmp\" (UID: \"39b937ee-4cf8-447f-8a1b-d7c080bc504d\") " pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.623907 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39b937ee-4cf8-447f-8a1b-d7c080bc504d-config\") pod \"controller-manager-7f7dc56458-4rqmp\" (UID: \"39b937ee-4cf8-447f-8a1b-d7c080bc504d\") " pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.623973 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/39b937ee-4cf8-447f-8a1b-d7c080bc504d-client-ca\") pod \"controller-manager-7f7dc56458-4rqmp\" (UID: \"39b937ee-4cf8-447f-8a1b-d7c080bc504d\") " pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.625465 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/39b937ee-4cf8-447f-8a1b-d7c080bc504d-proxy-ca-bundles\") pod \"controller-manager-7f7dc56458-4rqmp\" (UID: \"39b937ee-4cf8-447f-8a1b-d7c080bc504d\") " pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.625571 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/39b937ee-4cf8-447f-8a1b-d7c080bc504d-client-ca\") pod \"controller-manager-7f7dc56458-4rqmp\" (UID: \"39b937ee-4cf8-447f-8a1b-d7c080bc504d\") " pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.626096 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39b937ee-4cf8-447f-8a1b-d7c080bc504d-config\") pod \"controller-manager-7f7dc56458-4rqmp\" (UID: \"39b937ee-4cf8-447f-8a1b-d7c080bc504d\") " pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.636880 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39b937ee-4cf8-447f-8a1b-d7c080bc504d-serving-cert\") pod \"controller-manager-7f7dc56458-4rqmp\" (UID: \"39b937ee-4cf8-447f-8a1b-d7c080bc504d\") " pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.650766 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqj9r\" (UniqueName: \"kubernetes.io/projected/39b937ee-4cf8-447f-8a1b-d7c080bc504d-kube-api-access-bqj9r\") pod \"controller-manager-7f7dc56458-4rqmp\" (UID: \"39b937ee-4cf8-447f-8a1b-d7c080bc504d\") " pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"
Oct 06 06:57:43 crc kubenswrapper[4845]: I1006 06:57:43.774835 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"
Oct 06 06:57:44 crc kubenswrapper[4845]: I1006 06:57:44.092814 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx" event={"ID":"d70c8b87-6c40-4ee0-ad1c-68feecb12f80","Type":"ContainerStarted","Data":"3c42d46167201bc350da419a781e8231fe418b3749d633b06ce706f4a7a2e5b6"}
Oct 06 06:57:44 crc kubenswrapper[4845]: I1006 06:57:44.093453 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx" event={"ID":"d70c8b87-6c40-4ee0-ad1c-68feecb12f80","Type":"ContainerStarted","Data":"727f094915023db4642825d955c652aed0d62d510f041a1293c121024ed64de7"}
Oct 06 06:57:44 crc kubenswrapper[4845]: I1006 06:57:44.093849 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx"
Oct 06 06:57:44 crc kubenswrapper[4845]: I1006 06:57:44.103874 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx"
Oct 06 06:57:44 crc kubenswrapper[4845]: I1006 06:57:44.109845 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-79cbd5d-nkdqx" podStartSLOduration=2.109827093 podStartE2EDuration="2.109827093s" podCreationTimestamp="2025-10-06 06:57:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:57:44.109829233 +0000 UTC m=+748.624570241" watchObservedRunningTime="2025-10-06 06:57:44.109827093 +0000 UTC m=+748.624568101"
Oct 06 06:57:44 crc kubenswrapper[4845]: I1006 06:57:44.235886 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e6cec0-1b44-4424-b931-81b30b582922" path="/var/lib/kubelet/pods/25e6cec0-1b44-4424-b931-81b30b582922/volumes"
Oct 06 06:57:44 crc kubenswrapper[4845]: I1006 06:57:44.236840 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b245e98d-5e97-4ab4-b35c-044899fab150" path="/var/lib/kubelet/pods/b245e98d-5e97-4ab4-b35c-044899fab150/volumes"
Oct 06 06:57:44 crc kubenswrapper[4845]: I1006 06:57:44.244787 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"]
Oct 06 06:57:44 crc kubenswrapper[4845]: W1006 06:57:44.253228 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39b937ee_4cf8_447f_8a1b_d7c080bc504d.slice/crio-ed7846610a6fc2ed26cfddc9c221c71be4a8019629cbc063aad31ed4ba66679a WatchSource:0}: Error finding container ed7846610a6fc2ed26cfddc9c221c71be4a8019629cbc063aad31ed4ba66679a: Status 404 returned error can't find the container with id ed7846610a6fc2ed26cfddc9c221c71be4a8019629cbc063aad31ed4ba66679a
Oct 06 06:57:45 crc kubenswrapper[4845]: I1006 06:57:45.103716 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp" event={"ID":"39b937ee-4cf8-447f-8a1b-d7c080bc504d","Type":"ContainerStarted","Data":"e16798cecd0d6bd6e9b3e0652ac81db4fd14ffef4f8ab67a4d63af099bebc64a"}
Oct 06 06:57:45 crc kubenswrapper[4845]: I1006 06:57:45.103771 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp" event={"ID":"39b937ee-4cf8-447f-8a1b-d7c080bc504d","Type":"ContainerStarted","Data":"ed7846610a6fc2ed26cfddc9c221c71be4a8019629cbc063aad31ed4ba66679a"}
Oct 06 06:57:45 crc kubenswrapper[4845]: I1006 06:57:45.106328 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"
Oct 06 06:57:45 crc kubenswrapper[4845]: I1006 06:57:45.108726 4845 generic.go:334] "Generic (PLEG): container finished" podID="609d88e3-66d9-4f44-a539-2b6c35886f06" containerID="c43f1acd5bc2e199e0b3a61e473e7910ca59e3e4c72a47cfdc6ce4f4cc9973c3" exitCode=0
Oct 06 06:57:45 crc kubenswrapper[4845]: I1006 06:57:45.109010 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl" event={"ID":"609d88e3-66d9-4f44-a539-2b6c35886f06","Type":"ContainerDied","Data":"c43f1acd5bc2e199e0b3a61e473e7910ca59e3e4c72a47cfdc6ce4f4cc9973c3"}
Oct 06 06:57:45 crc kubenswrapper[4845]: I1006 06:57:45.109876 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp"
Oct 06 06:57:45 crc kubenswrapper[4845]: I1006 06:57:45.136421 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7f7dc56458-4rqmp" podStartSLOduration=4.136404939 podStartE2EDuration="4.136404939s" podCreationTimestamp="2025-10-06 06:57:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:57:45.132570902 +0000 UTC m=+749.647311910" watchObservedRunningTime="2025-10-06 06:57:45.136404939 +0000 UTC m=+749.651145967"
Oct 06 06:57:46 crc kubenswrapper[4845]: I1006 06:57:46.120037 4845 generic.go:334] "Generic (PLEG): container finished" podID="609d88e3-66d9-4f44-a539-2b6c35886f06" containerID="179691b0f11a8fc3d4d0783a7c63ade8b27bdc4eacfca703da8c8c0e27eee74c" exitCode=0
Oct 06 06:57:46 crc kubenswrapper[4845]: I1006 06:57:46.120087 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl" event={"ID":"609d88e3-66d9-4f44-a539-2b6c35886f06","Type":"ContainerDied","Data":"179691b0f11a8fc3d4d0783a7c63ade8b27bdc4eacfca703da8c8c0e27eee74c"}
Oct 06 06:57:47 crc kubenswrapper[4845]: I1006 06:57:47.429585 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl"
Oct 06 06:57:47 crc kubenswrapper[4845]: I1006 06:57:47.585061 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqh8r\" (UniqueName: \"kubernetes.io/projected/609d88e3-66d9-4f44-a539-2b6c35886f06-kube-api-access-zqh8r\") pod \"609d88e3-66d9-4f44-a539-2b6c35886f06\" (UID: \"609d88e3-66d9-4f44-a539-2b6c35886f06\") "
Oct 06 06:57:47 crc kubenswrapper[4845]: I1006 06:57:47.585168 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/609d88e3-66d9-4f44-a539-2b6c35886f06-util\") pod \"609d88e3-66d9-4f44-a539-2b6c35886f06\" (UID: \"609d88e3-66d9-4f44-a539-2b6c35886f06\") "
Oct 06 06:57:47 crc kubenswrapper[4845]: I1006 06:57:47.585331 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/609d88e3-66d9-4f44-a539-2b6c35886f06-bundle\") pod \"609d88e3-66d9-4f44-a539-2b6c35886f06\" (UID: \"609d88e3-66d9-4f44-a539-2b6c35886f06\") "
Oct 06 06:57:47 crc kubenswrapper[4845]: I1006 06:57:47.586138 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/609d88e3-66d9-4f44-a539-2b6c35886f06-bundle" (OuterVolumeSpecName: "bundle") pod "609d88e3-66d9-4f44-a539-2b6c35886f06" (UID: "609d88e3-66d9-4f44-a539-2b6c35886f06"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:57:47 crc kubenswrapper[4845]: I1006 06:57:47.595905 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/609d88e3-66d9-4f44-a539-2b6c35886f06-kube-api-access-zqh8r" (OuterVolumeSpecName: "kube-api-access-zqh8r") pod "609d88e3-66d9-4f44-a539-2b6c35886f06" (UID: "609d88e3-66d9-4f44-a539-2b6c35886f06"). InnerVolumeSpecName "kube-api-access-zqh8r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:57:47 crc kubenswrapper[4845]: I1006 06:57:47.598744 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/609d88e3-66d9-4f44-a539-2b6c35886f06-util" (OuterVolumeSpecName: "util") pod "609d88e3-66d9-4f44-a539-2b6c35886f06" (UID: "609d88e3-66d9-4f44-a539-2b6c35886f06"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:57:47 crc kubenswrapper[4845]: I1006 06:57:47.687370 4845 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/609d88e3-66d9-4f44-a539-2b6c35886f06-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 06:57:47 crc kubenswrapper[4845]: I1006 06:57:47.687455 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqh8r\" (UniqueName: \"kubernetes.io/projected/609d88e3-66d9-4f44-a539-2b6c35886f06-kube-api-access-zqh8r\") on node \"crc\" DevicePath \"\""
Oct 06 06:57:47 crc kubenswrapper[4845]: I1006 06:57:47.687476 4845 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/609d88e3-66d9-4f44-a539-2b6c35886f06-util\") on node \"crc\" DevicePath \"\""
Oct 06 06:57:48 crc kubenswrapper[4845]: I1006 06:57:48.134332 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl" event={"ID":"609d88e3-66d9-4f44-a539-2b6c35886f06","Type":"ContainerDied","Data":"de4e6937dfaf713355858142a8431d8e7222e86770499dea4ff0b1ee1f3e6721"}
Oct 06 06:57:48 crc kubenswrapper[4845]: I1006 06:57:48.134400 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de4e6937dfaf713355858142a8431d8e7222e86770499dea4ff0b1ee1f3e6721"
Oct 06 06:57:48 crc kubenswrapper[4845]: I1006 06:57:48.134406 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl"
Oct 06 06:57:49 crc kubenswrapper[4845]: I1006 06:57:49.879419 4845 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 06 06:57:53 crc kubenswrapper[4845]: I1006 06:57:53.018910 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 06:57:53 crc kubenswrapper[4845]: I1006 06:57:53.019256 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 06:57:53 crc kubenswrapper[4845]: I1006 06:57:53.019303 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6"
Oct 06 06:57:53 crc kubenswrapper[4845]: I1006 06:57:53.019872 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"443ce9eb4600adb1d78ff49056676775ceac0472627215eb275bc31a89a7c3b8"} pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 06:57:53 crc kubenswrapper[4845]: I1006 06:57:53.019933 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" containerID="cri-o://443ce9eb4600adb1d78ff49056676775ceac0472627215eb275bc31a89a7c3b8" gracePeriod=600
Oct 06 06:57:53 crc kubenswrapper[4845]: I1006 06:57:53.173180 4845 generic.go:334] "Generic (PLEG): container finished" podID="6936952c-09f0-48fd-8832-38c18202ae81" containerID="443ce9eb4600adb1d78ff49056676775ceac0472627215eb275bc31a89a7c3b8" exitCode=0
Oct 06 06:57:53 crc kubenswrapper[4845]: I1006 06:57:53.173420 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerDied","Data":"443ce9eb4600adb1d78ff49056676775ceac0472627215eb275bc31a89a7c3b8"}
Oct 06 06:57:53 crc kubenswrapper[4845]: I1006 06:57:53.173506 4845 scope.go:117] "RemoveContainer" containerID="79e33f4807773eca1fba9582255263880010ec7bbf733f8b698646f795a161ec"
Oct 06 06:57:54 crc kubenswrapper[4845]: I1006 06:57:54.182295 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerStarted","Data":"f1a1b8d6a136dbd6653eb7b5058c9c79831c66bac509d23e8f8977bad3f0b842"}
Oct 06 06:57:54 crc kubenswrapper[4845]: I1006 06:57:54.332837 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-677d5bb784-zdlwp"]
Oct 06 06:57:54 crc kubenswrapper[4845]: E1006 06:57:54.333111 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="609d88e3-66d9-4f44-a539-2b6c35886f06" containerName="extract"
Oct 06 06:57:54 crc kubenswrapper[4845]: I1006 06:57:54.333130 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="609d88e3-66d9-4f44-a539-2b6c35886f06" containerName="extract"
Oct 06 06:57:54 crc kubenswrapper[4845]: E1006 06:57:54.333148 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="609d88e3-66d9-4f44-a539-2b6c35886f06" containerName="util"
Oct 06 06:57:54 crc kubenswrapper[4845]: I1006 06:57:54.333156 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="609d88e3-66d9-4f44-a539-2b6c35886f06" containerName="util"
Oct 06 06:57:54 crc kubenswrapper[4845]: E1006 06:57:54.333171 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="609d88e3-66d9-4f44-a539-2b6c35886f06" containerName="pull"
Oct 06 06:57:54 crc kubenswrapper[4845]: I1006 06:57:54.333180 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="609d88e3-66d9-4f44-a539-2b6c35886f06" containerName="pull"
Oct 06 06:57:54 crc kubenswrapper[4845]: I1006 06:57:54.333318 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="609d88e3-66d9-4f44-a539-2b6c35886f06" containerName="extract"
Oct 06 06:57:54 crc kubenswrapper[4845]: I1006 06:57:54.334024 4845 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-zdlwp" Oct 06 06:57:54 crc kubenswrapper[4845]: I1006 06:57:54.335921 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-gnk9k" Oct 06 06:57:54 crc kubenswrapper[4845]: I1006 06:57:54.394335 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-677d5bb784-zdlwp"] Oct 06 06:57:54 crc kubenswrapper[4845]: I1006 06:57:54.475623 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8xxd\" (UniqueName: \"kubernetes.io/projected/f8c7a5a4-11d3-4d38-95d5-fb90e97378b9-kube-api-access-x8xxd\") pod \"openstack-operator-controller-operator-677d5bb784-zdlwp\" (UID: \"f8c7a5a4-11d3-4d38-95d5-fb90e97378b9\") " pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-zdlwp" Oct 06 06:57:54 crc kubenswrapper[4845]: I1006 06:57:54.578450 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8xxd\" (UniqueName: \"kubernetes.io/projected/f8c7a5a4-11d3-4d38-95d5-fb90e97378b9-kube-api-access-x8xxd\") pod \"openstack-operator-controller-operator-677d5bb784-zdlwp\" (UID: \"f8c7a5a4-11d3-4d38-95d5-fb90e97378b9\") " pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-zdlwp" Oct 06 06:57:54 crc kubenswrapper[4845]: I1006 06:57:54.607077 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8xxd\" (UniqueName: \"kubernetes.io/projected/f8c7a5a4-11d3-4d38-95d5-fb90e97378b9-kube-api-access-x8xxd\") pod \"openstack-operator-controller-operator-677d5bb784-zdlwp\" (UID: \"f8c7a5a4-11d3-4d38-95d5-fb90e97378b9\") " pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-zdlwp" Oct 06 06:57:54 crc kubenswrapper[4845]: I1006 06:57:54.654865 4845 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-zdlwp" Oct 06 06:57:55 crc kubenswrapper[4845]: I1006 06:57:55.080606 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-677d5bb784-zdlwp"] Oct 06 06:57:55 crc kubenswrapper[4845]: I1006 06:57:55.191418 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-zdlwp" event={"ID":"f8c7a5a4-11d3-4d38-95d5-fb90e97378b9","Type":"ContainerStarted","Data":"5f76e35649517f7fe0e8aad82aba011206104c6ce09041bec6afe155ff3ae4d8"} Oct 06 06:57:59 crc kubenswrapper[4845]: I1006 06:57:59.219961 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-zdlwp" event={"ID":"f8c7a5a4-11d3-4d38-95d5-fb90e97378b9","Type":"ContainerStarted","Data":"72117303ee911dfc9cf202ec60166d77897486c82ba9148ce60a46f1ecc7b9bf"} Oct 06 06:58:01 crc kubenswrapper[4845]: I1006 06:58:01.235926 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-zdlwp" event={"ID":"f8c7a5a4-11d3-4d38-95d5-fb90e97378b9","Type":"ContainerStarted","Data":"305105e533f69042854d0bb7a799a9243f4faf76152bb1d30c53f4abaf8ad392"} Oct 06 06:58:01 crc kubenswrapper[4845]: I1006 06:58:01.236236 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-zdlwp" Oct 06 06:58:01 crc kubenswrapper[4845]: I1006 06:58:01.264927 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-zdlwp" podStartSLOduration=1.543996603 podStartE2EDuration="7.264909015s" podCreationTimestamp="2025-10-06 06:57:54 +0000 UTC" firstStartedPulling="2025-10-06 
06:57:55.093906954 +0000 UTC m=+759.608647972" lastFinishedPulling="2025-10-06 06:58:00.814819376 +0000 UTC m=+765.329560384" observedRunningTime="2025-10-06 06:58:01.259743265 +0000 UTC m=+765.774484313" watchObservedRunningTime="2025-10-06 06:58:01.264909015 +0000 UTC m=+765.779650033" Oct 06 06:58:04 crc kubenswrapper[4845]: I1006 06:58:04.657289 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-zdlwp" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.728085 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-84bd8f6848-mvrqc"] Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.729502 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-mvrqc" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.731882 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-9ztgb" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.748182 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-84bd8f6848-mvrqc"] Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.758408 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-58d86cd59d-hf7mm"] Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.761134 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-hf7mm" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.761856 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5b974f6766-qzbsb"] Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.762857 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-qzbsb" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.763586 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-9pjcd" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.771416 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-mdj2c" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.778501 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-58d86cd59d-hf7mm"] Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.784657 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5b974f6766-qzbsb"] Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.794369 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-698456cdc6-kdbk5"] Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.798729 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-698456cdc6-kdbk5" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.808217 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-jxtpn" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.813014 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5c497dbdb-nw58b"] Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.817037 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-nw58b" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.843619 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-422pc" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.845934 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-698456cdc6-kdbk5"] Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.875126 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5c497dbdb-nw58b"] Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.875703 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7sk7\" (UniqueName: \"kubernetes.io/projected/0a59590c-5261-403e-a7e3-0e726a025412-kube-api-access-c7sk7\") pod \"designate-operator-controller-manager-58d86cd59d-hf7mm\" (UID: \"0a59590c-5261-403e-a7e3-0e726a025412\") " pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-hf7mm" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.875796 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55qtt\" (UniqueName: 
\"kubernetes.io/projected/2e4d8467-600b-4ae3-8b1a-c4f0416718f7-kube-api-access-55qtt\") pod \"barbican-operator-controller-manager-5b974f6766-qzbsb\" (UID: \"2e4d8467-600b-4ae3-8b1a-c4f0416718f7\") " pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-qzbsb" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.875857 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2hj4\" (UniqueName: \"kubernetes.io/projected/4629df36-951c-461d-9b54-c69cbec8bcd5-kube-api-access-r2hj4\") pod \"cinder-operator-controller-manager-84bd8f6848-mvrqc\" (UID: \"4629df36-951c-461d-9b54-c69cbec8bcd5\") " pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-mvrqc" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.880759 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6675647785-4xx5n"] Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.881861 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6675647785-4xx5n" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.886612 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-g7pj2" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.912745 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-84788b6bc5-s8lqw"] Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.913943 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-s8lqw" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.916883 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-td7t8" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.917082 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.932027 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6675647785-4xx5n"] Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.942577 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-84788b6bc5-s8lqw"] Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.949467 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f5894c49f-lx8hg"] Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.950593 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-lx8hg" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.952498 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-xlsld" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.962038 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-57c9cdcf57-tf759"] Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.963036 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-tf759" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.968163 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-njddl" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.973462 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f5894c49f-lx8hg"] Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.976765 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55qtt\" (UniqueName: \"kubernetes.io/projected/2e4d8467-600b-4ae3-8b1a-c4f0416718f7-kube-api-access-55qtt\") pod \"barbican-operator-controller-manager-5b974f6766-qzbsb\" (UID: \"2e4d8467-600b-4ae3-8b1a-c4f0416718f7\") " pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-qzbsb" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.976811 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfcdc\" (UniqueName: \"kubernetes.io/projected/5166cd91-6a8c-4b81-b311-2a1e561928d3-kube-api-access-gfcdc\") pod \"heat-operator-controller-manager-5c497dbdb-nw58b\" (UID: \"5166cd91-6a8c-4b81-b311-2a1e561928d3\") " pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-nw58b" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.976863 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2hj4\" (UniqueName: \"kubernetes.io/projected/4629df36-951c-461d-9b54-c69cbec8bcd5-kube-api-access-r2hj4\") pod \"cinder-operator-controller-manager-84bd8f6848-mvrqc\" (UID: \"4629df36-951c-461d-9b54-c69cbec8bcd5\") " pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-mvrqc" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.976889 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c7sk7\" (UniqueName: \"kubernetes.io/projected/0a59590c-5261-403e-a7e3-0e726a025412-kube-api-access-c7sk7\") pod \"designate-operator-controller-manager-58d86cd59d-hf7mm\" (UID: \"0a59590c-5261-403e-a7e3-0e726a025412\") " pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-hf7mm" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.976917 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbcvs\" (UniqueName: \"kubernetes.io/projected/540c7899-a612-4745-8c42-02033d088f73-kube-api-access-fbcvs\") pod \"glance-operator-controller-manager-698456cdc6-kdbk5\" (UID: \"540c7899-a612-4745-8c42-02033d088f73\") " pod="openstack-operators/glance-operator-controller-manager-698456cdc6-kdbk5" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.976948 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xpwq\" (UniqueName: \"kubernetes.io/projected/c3eb4ea2-4738-4a6e-9eed-f41f7a616cdd-kube-api-access-5xpwq\") pod \"horizon-operator-controller-manager-6675647785-4xx5n\" (UID: \"c3eb4ea2-4738-4a6e-9eed-f41f7a616cdd\") " pod="openstack-operators/horizon-operator-controller-manager-6675647785-4xx5n" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.984971 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7cb48dbc-6vhjt"] Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.985953 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-6vhjt" Oct 06 06:58:21 crc kubenswrapper[4845]: I1006 06:58:21.989353 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-vn664" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.010743 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-wghng"] Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.012427 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-wghng" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.023986 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55qtt\" (UniqueName: \"kubernetes.io/projected/2e4d8467-600b-4ae3-8b1a-c4f0416718f7-kube-api-access-55qtt\") pod \"barbican-operator-controller-manager-5b974f6766-qzbsb\" (UID: \"2e4d8467-600b-4ae3-8b1a-c4f0416718f7\") " pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-qzbsb" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.027137 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2hj4\" (UniqueName: \"kubernetes.io/projected/4629df36-951c-461d-9b54-c69cbec8bcd5-kube-api-access-r2hj4\") pod \"cinder-operator-controller-manager-84bd8f6848-mvrqc\" (UID: \"4629df36-951c-461d-9b54-c69cbec8bcd5\") " pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-mvrqc" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.028088 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-qt9ts" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.028228 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-57c9cdcf57-tf759"] Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.040981 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7sk7\" (UniqueName: \"kubernetes.io/projected/0a59590c-5261-403e-a7e3-0e726a025412-kube-api-access-c7sk7\") pod \"designate-operator-controller-manager-58d86cd59d-hf7mm\" (UID: \"0a59590c-5261-403e-a7e3-0e726a025412\") " pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-hf7mm" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.045050 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7cb48dbc-6vhjt"] Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.049840 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-mvrqc" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.054681 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-wghng"] Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.064439 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-69b956fbf6-c4htj"] Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.065506 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-c4htj" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.073476 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-sb2zt" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.079961 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr22m\" (UniqueName: \"kubernetes.io/projected/3f91fed8-7759-443b-869a-886f63b42502-kube-api-access-xr22m\") pod \"infra-operator-controller-manager-84788b6bc5-s8lqw\" (UID: \"3f91fed8-7759-443b-869a-886f63b42502\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-s8lqw" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.080017 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xpwq\" (UniqueName: \"kubernetes.io/projected/c3eb4ea2-4738-4a6e-9eed-f41f7a616cdd-kube-api-access-5xpwq\") pod \"horizon-operator-controller-manager-6675647785-4xx5n\" (UID: \"c3eb4ea2-4738-4a6e-9eed-f41f7a616cdd\") " pod="openstack-operators/horizon-operator-controller-manager-6675647785-4xx5n" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.080070 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmqd2\" (UniqueName: \"kubernetes.io/projected/09e0aaab-0038-4f9b-881e-3774781f2825-kube-api-access-wmqd2\") pod \"keystone-operator-controller-manager-57c9cdcf57-tf759\" (UID: \"09e0aaab-0038-4f9b-881e-3774781f2825\") " pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-tf759" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.080131 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfcdc\" (UniqueName: \"kubernetes.io/projected/5166cd91-6a8c-4b81-b311-2a1e561928d3-kube-api-access-gfcdc\") 
pod \"heat-operator-controller-manager-5c497dbdb-nw58b\" (UID: \"5166cd91-6a8c-4b81-b311-2a1e561928d3\") " pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-nw58b" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.080161 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9pz2\" (UniqueName: \"kubernetes.io/projected/12269390-1665-4934-8695-eab596535e81-kube-api-access-f9pz2\") pod \"manila-operator-controller-manager-7cb48dbc-6vhjt\" (UID: \"12269390-1665-4934-8695-eab596535e81\") " pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-6vhjt" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.080181 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f91fed8-7759-443b-869a-886f63b42502-cert\") pod \"infra-operator-controller-manager-84788b6bc5-s8lqw\" (UID: \"3f91fed8-7759-443b-869a-886f63b42502\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-s8lqw" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.080214 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4rv2\" (UniqueName: \"kubernetes.io/projected/01476e52-bab1-4b3b-b2d9-3a9d9469c943-kube-api-access-b4rv2\") pod \"ironic-operator-controller-manager-6f5894c49f-lx8hg\" (UID: \"01476e52-bab1-4b3b-b2d9-3a9d9469c943\") " pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-lx8hg" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.080245 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbcvs\" (UniqueName: \"kubernetes.io/projected/540c7899-a612-4745-8c42-02033d088f73-kube-api-access-fbcvs\") pod \"glance-operator-controller-manager-698456cdc6-kdbk5\" (UID: \"540c7899-a612-4745-8c42-02033d088f73\") " 
pod="openstack-operators/glance-operator-controller-manager-698456cdc6-kdbk5" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.081022 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-hf7mm" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.098866 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-qzbsb" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.107480 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-69b956fbf6-c4htj"] Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.109771 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xpwq\" (UniqueName: \"kubernetes.io/projected/c3eb4ea2-4738-4a6e-9eed-f41f7a616cdd-kube-api-access-5xpwq\") pod \"horizon-operator-controller-manager-6675647785-4xx5n\" (UID: \"c3eb4ea2-4738-4a6e-9eed-f41f7a616cdd\") " pod="openstack-operators/horizon-operator-controller-manager-6675647785-4xx5n" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.113048 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbcvs\" (UniqueName: \"kubernetes.io/projected/540c7899-a612-4745-8c42-02033d088f73-kube-api-access-fbcvs\") pod \"glance-operator-controller-manager-698456cdc6-kdbk5\" (UID: \"540c7899-a612-4745-8c42-02033d088f73\") " pod="openstack-operators/glance-operator-controller-manager-698456cdc6-kdbk5" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.117610 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfcdc\" (UniqueName: \"kubernetes.io/projected/5166cd91-6a8c-4b81-b311-2a1e561928d3-kube-api-access-gfcdc\") pod \"heat-operator-controller-manager-5c497dbdb-nw58b\" (UID: \"5166cd91-6a8c-4b81-b311-2a1e561928d3\") " 
pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-nw58b"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.143975 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-6c9b57c67-wrcqz"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.152789 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-wrcqz"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.172349 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-kbq6v"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.173074 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f59f9d8-t66dv"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.175859 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-698456cdc6-kdbk5"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.179903 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f59f9d8-t66dv"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.180023 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-t66dv"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.181038 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmqd2\" (UniqueName: \"kubernetes.io/projected/09e0aaab-0038-4f9b-881e-3774781f2825-kube-api-access-wmqd2\") pod \"keystone-operator-controller-manager-57c9cdcf57-tf759\" (UID: \"09e0aaab-0038-4f9b-881e-3774781f2825\") " pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-tf759"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.181094 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b27gb\" (UniqueName: \"kubernetes.io/projected/223d7355-4741-4d59-b6cf-e71702ddc20e-kube-api-access-b27gb\") pod \"mariadb-operator-controller-manager-d6c9dc5bc-wghng\" (UID: \"223d7355-4741-4d59-b6cf-e71702ddc20e\") " pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-wghng"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.181118 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9pz2\" (UniqueName: \"kubernetes.io/projected/12269390-1665-4934-8695-eab596535e81-kube-api-access-f9pz2\") pod \"manila-operator-controller-manager-7cb48dbc-6vhjt\" (UID: \"12269390-1665-4934-8695-eab596535e81\") " pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-6vhjt"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.181139 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f91fed8-7759-443b-869a-886f63b42502-cert\") pod \"infra-operator-controller-manager-84788b6bc5-s8lqw\" (UID: \"3f91fed8-7759-443b-869a-886f63b42502\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-s8lqw"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.181168 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4rv2\" (UniqueName: \"kubernetes.io/projected/01476e52-bab1-4b3b-b2d9-3a9d9469c943-kube-api-access-b4rv2\") pod \"ironic-operator-controller-manager-6f5894c49f-lx8hg\" (UID: \"01476e52-bab1-4b3b-b2d9-3a9d9469c943\") " pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-lx8hg"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.181200 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-422ql\" (UniqueName: \"kubernetes.io/projected/49f6ba5b-4750-418f-ac64-574d92bf6f61-kube-api-access-422ql\") pod \"neutron-operator-controller-manager-69b956fbf6-c4htj\" (UID: \"49f6ba5b-4750-418f-ac64-574d92bf6f61\") " pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-c4htj"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.181219 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr22m\" (UniqueName: \"kubernetes.io/projected/3f91fed8-7759-443b-869a-886f63b42502-kube-api-access-xr22m\") pod \"infra-operator-controller-manager-84788b6bc5-s8lqw\" (UID: \"3f91fed8-7759-443b-869a-886f63b42502\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-s8lqw"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.181679 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-pkgxt"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.186533 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-nw58b"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.187272 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f91fed8-7759-443b-869a-886f63b42502-cert\") pod \"infra-operator-controller-manager-84788b6bc5-s8lqw\" (UID: \"3f91fed8-7759-443b-869a-886f63b42502\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-s8lqw"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.190178 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-6c9b57c67-wrcqz"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.196430 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d582qhb"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.198586 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d582qhb"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.201353 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.201669 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-j7vd5"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.201764 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-c968bb45-h4m5g"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.202808 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-c968bb45-h4m5g"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.205688 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-8pd82"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.208933 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmqd2\" (UniqueName: \"kubernetes.io/projected/09e0aaab-0038-4f9b-881e-3774781f2825-kube-api-access-wmqd2\") pod \"keystone-operator-controller-manager-57c9cdcf57-tf759\" (UID: \"09e0aaab-0038-4f9b-881e-3774781f2825\") " pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-tf759"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.209159 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-c968bb45-h4m5g"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.212431 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4rv2\" (UniqueName: \"kubernetes.io/projected/01476e52-bab1-4b3b-b2d9-3a9d9469c943-kube-api-access-b4rv2\") pod \"ironic-operator-controller-manager-6f5894c49f-lx8hg\" (UID: \"01476e52-bab1-4b3b-b2d9-3a9d9469c943\") " pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-lx8hg"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.212969 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr22m\" (UniqueName: \"kubernetes.io/projected/3f91fed8-7759-443b-869a-886f63b42502-kube-api-access-xr22m\") pod \"infra-operator-controller-manager-84788b6bc5-s8lqw\" (UID: \"3f91fed8-7759-443b-869a-886f63b42502\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-s8lqw"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.213136 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6675647785-4xx5n"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.225832 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d582qhb"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.238804 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-s8lqw"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.245751 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9pz2\" (UniqueName: \"kubernetes.io/projected/12269390-1665-4934-8695-eab596535e81-kube-api-access-f9pz2\") pod \"manila-operator-controller-manager-7cb48dbc-6vhjt\" (UID: \"12269390-1665-4934-8695-eab596535e81\") " pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-6vhjt"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.249208 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-66f6d6849b-5qmws"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.250195 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-5qmws"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.254033 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-tm2zx"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.262271 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-66f6d6849b-5qmws"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.299864 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwrhf\" (UniqueName: \"kubernetes.io/projected/1877f632-ca26-4045-a192-08d2f0f97a4e-kube-api-access-xwrhf\") pod \"openstack-baremetal-operator-controller-manager-66cc85b5d582qhb\" (UID: \"1877f632-ca26-4045-a192-08d2f0f97a4e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d582qhb"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.300727 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b27gb\" (UniqueName: \"kubernetes.io/projected/223d7355-4741-4d59-b6cf-e71702ddc20e-kube-api-access-b27gb\") pod \"mariadb-operator-controller-manager-d6c9dc5bc-wghng\" (UID: \"223d7355-4741-4d59-b6cf-e71702ddc20e\") " pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-wghng"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.300799 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1877f632-ca26-4045-a192-08d2f0f97a4e-cert\") pod \"openstack-baremetal-operator-controller-manager-66cc85b5d582qhb\" (UID: \"1877f632-ca26-4045-a192-08d2f0f97a4e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d582qhb"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.300933 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9j76\" (UniqueName: \"kubernetes.io/projected/93be683f-25a1-477e-b676-5bc7be2c3bf8-kube-api-access-w9j76\") pod \"nova-operator-controller-manager-6c9b57c67-wrcqz\" (UID: \"93be683f-25a1-477e-b676-5bc7be2c3bf8\") " pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-wrcqz"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.301015 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-422ql\" (UniqueName: \"kubernetes.io/projected/49f6ba5b-4750-418f-ac64-574d92bf6f61-kube-api-access-422ql\") pod \"neutron-operator-controller-manager-69b956fbf6-c4htj\" (UID: \"49f6ba5b-4750-418f-ac64-574d92bf6f61\") " pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-c4htj"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.301156 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-lx8hg"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.301787 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-76d5577b-9v85j"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.306676 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-76d5577b-9v85j"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.307631 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxzbm\" (UniqueName: \"kubernetes.io/projected/58567cef-7ff1-455c-a1d3-1f7a6f35a504-kube-api-access-cxzbm\") pod \"octavia-operator-controller-manager-69f59f9d8-t66dv\" (UID: \"58567cef-7ff1-455c-a1d3-1f7a6f35a504\") " pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-t66dv"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.314510 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-tf759"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.325238 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-tb8zs"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.328130 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-6vhjt"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.342996 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b27gb\" (UniqueName: \"kubernetes.io/projected/223d7355-4741-4d59-b6cf-e71702ddc20e-kube-api-access-b27gb\") pod \"mariadb-operator-controller-manager-d6c9dc5bc-wghng\" (UID: \"223d7355-4741-4d59-b6cf-e71702ddc20e\") " pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-wghng"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.351814 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-422ql\" (UniqueName: \"kubernetes.io/projected/49f6ba5b-4750-418f-ac64-574d92bf6f61-kube-api-access-422ql\") pod \"neutron-operator-controller-manager-69b956fbf6-c4htj\" (UID: \"49f6ba5b-4750-418f-ac64-574d92bf6f61\") " pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-c4htj"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.355639 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-f589c7597-c2mlw"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.358168 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-c2mlw"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.360205 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-76d5577b-9v85j"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.368209 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-f589c7597-c2mlw"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.371619 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-x422d"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.410310 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-nsmkl"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.416809 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-nsmkl"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.421624 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-8d2cj"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.422021 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-nsmkl"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.433006 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwrhf\" (UniqueName: \"kubernetes.io/projected/1877f632-ca26-4045-a192-08d2f0f97a4e-kube-api-access-xwrhf\") pod \"openstack-baremetal-operator-controller-manager-66cc85b5d582qhb\" (UID: \"1877f632-ca26-4045-a192-08d2f0f97a4e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d582qhb"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.433065 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1877f632-ca26-4045-a192-08d2f0f97a4e-cert\") pod \"openstack-baremetal-operator-controller-manager-66cc85b5d582qhb\" (UID: \"1877f632-ca26-4045-a192-08d2f0f97a4e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d582qhb"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.433102 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9j76\" (UniqueName: \"kubernetes.io/projected/93be683f-25a1-477e-b676-5bc7be2c3bf8-kube-api-access-w9j76\") pod \"nova-operator-controller-manager-6c9b57c67-wrcqz\" (UID: \"93be683f-25a1-477e-b676-5bc7be2c3bf8\") " pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-wrcqz"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.433147 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgm8x\" (UniqueName: \"kubernetes.io/projected/38e8455f-f063-46aa-8275-20b6d80aa9ea-kube-api-access-pgm8x\") pod \"swift-operator-controller-manager-76d5577b-9v85j\" (UID: \"38e8455f-f063-46aa-8275-20b6d80aa9ea\") " pod="openstack-operators/swift-operator-controller-manager-76d5577b-9v85j"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.433166 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz9lg\" (UniqueName: \"kubernetes.io/projected/c21520ff-b41e-4433-8656-d248f0975c60-kube-api-access-pz9lg\") pod \"placement-operator-controller-manager-66f6d6849b-5qmws\" (UID: \"c21520ff-b41e-4433-8656-d248f0975c60\") " pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-5qmws"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.433188 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxzbm\" (UniqueName: \"kubernetes.io/projected/58567cef-7ff1-455c-a1d3-1f7a6f35a504-kube-api-access-cxzbm\") pod \"octavia-operator-controller-manager-69f59f9d8-t66dv\" (UID: \"58567cef-7ff1-455c-a1d3-1f7a6f35a504\") " pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-t66dv"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.433210 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btcd7\" (UniqueName: \"kubernetes.io/projected/8342c3c4-6f77-4647-a4c2-9f834c55ee19-kube-api-access-btcd7\") pod \"ovn-operator-controller-manager-c968bb45-h4m5g\" (UID: \"8342c3c4-6f77-4647-a4c2-9f834c55ee19\") " pod="openstack-operators/ovn-operator-controller-manager-c968bb45-h4m5g"
Oct 06 06:58:22 crc kubenswrapper[4845]: E1006 06:58:22.433978 4845 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 06 06:58:22 crc kubenswrapper[4845]: E1006 06:58:22.434023 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1877f632-ca26-4045-a192-08d2f0f97a4e-cert podName:1877f632-ca26-4045-a192-08d2f0f97a4e nodeName:}" failed. No retries permitted until 2025-10-06 06:58:22.934009996 +0000 UTC m=+787.448751004 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1877f632-ca26-4045-a192-08d2f0f97a4e-cert") pod "openstack-baremetal-operator-controller-manager-66cc85b5d582qhb" (UID: "1877f632-ca26-4045-a192-08d2f0f97a4e") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.446445 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5d98cc5575-mj4wz"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.447506 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-mj4wz"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.464884 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5d98cc5575-mj4wz"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.464936 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-xflxj"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.475925 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-wghng"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.487211 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7cfc658b9-dlvb8"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.488550 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-dlvb8"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.494150 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-q27ts"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.494435 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.502137 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-c4htj"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.508303 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwrhf\" (UniqueName: \"kubernetes.io/projected/1877f632-ca26-4045-a192-08d2f0f97a4e-kube-api-access-xwrhf\") pod \"openstack-baremetal-operator-controller-manager-66cc85b5d582qhb\" (UID: \"1877f632-ca26-4045-a192-08d2f0f97a4e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d582qhb"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.513017 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxzbm\" (UniqueName: \"kubernetes.io/projected/58567cef-7ff1-455c-a1d3-1f7a6f35a504-kube-api-access-cxzbm\") pod \"octavia-operator-controller-manager-69f59f9d8-t66dv\" (UID: \"58567cef-7ff1-455c-a1d3-1f7a6f35a504\") " pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-t66dv"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.526873 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9j76\" (UniqueName: \"kubernetes.io/projected/93be683f-25a1-477e-b676-5bc7be2c3bf8-kube-api-access-w9j76\") pod \"nova-operator-controller-manager-6c9b57c67-wrcqz\" (UID: \"93be683f-25a1-477e-b676-5bc7be2c3bf8\") " pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-wrcqz"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.527453 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7cfc658b9-dlvb8"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.534122 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zglcw\" (UniqueName: \"kubernetes.io/projected/e3721fe9-cb88-4326-aa90-2d08b909515e-kube-api-access-zglcw\") pod \"telemetry-operator-controller-manager-f589c7597-c2mlw\" (UID: \"e3721fe9-cb88-4326-aa90-2d08b909515e\") " pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-c2mlw"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.534179 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgm8x\" (UniqueName: \"kubernetes.io/projected/38e8455f-f063-46aa-8275-20b6d80aa9ea-kube-api-access-pgm8x\") pod \"swift-operator-controller-manager-76d5577b-9v85j\" (UID: \"38e8455f-f063-46aa-8275-20b6d80aa9ea\") " pod="openstack-operators/swift-operator-controller-manager-76d5577b-9v85j"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.534208 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz9lg\" (UniqueName: \"kubernetes.io/projected/c21520ff-b41e-4433-8656-d248f0975c60-kube-api-access-pz9lg\") pod \"placement-operator-controller-manager-66f6d6849b-5qmws\" (UID: \"c21520ff-b41e-4433-8656-d248f0975c60\") " pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-5qmws"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.534231 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx752\" (UniqueName: \"kubernetes.io/projected/c921a4a9-e09a-4fd2-965e-c13f7fee169e-kube-api-access-hx752\") pod \"test-operator-controller-manager-6bb6dcddc-nsmkl\" (UID: \"c921a4a9-e09a-4fd2-965e-c13f7fee169e\") " pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-nsmkl"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.534255 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btcd7\" (UniqueName: \"kubernetes.io/projected/8342c3c4-6f77-4647-a4c2-9f834c55ee19-kube-api-access-btcd7\") pod \"ovn-operator-controller-manager-c968bb45-h4m5g\" (UID: \"8342c3c4-6f77-4647-a4c2-9f834c55ee19\") " pod="openstack-operators/ovn-operator-controller-manager-c968bb45-h4m5g"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.541127 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-wrcqz"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.550102 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.550922 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.556042 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-6w6l8"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.561902 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-t66dv"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.563388 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.573186 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgm8x\" (UniqueName: \"kubernetes.io/projected/38e8455f-f063-46aa-8275-20b6d80aa9ea-kube-api-access-pgm8x\") pod \"swift-operator-controller-manager-76d5577b-9v85j\" (UID: \"38e8455f-f063-46aa-8275-20b6d80aa9ea\") " pod="openstack-operators/swift-operator-controller-manager-76d5577b-9v85j"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.581287 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz9lg\" (UniqueName: \"kubernetes.io/projected/c21520ff-b41e-4433-8656-d248f0975c60-kube-api-access-pz9lg\") pod \"placement-operator-controller-manager-66f6d6849b-5qmws\" (UID: \"c21520ff-b41e-4433-8656-d248f0975c60\") " pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-5qmws"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.581790 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btcd7\" (UniqueName: \"kubernetes.io/projected/8342c3c4-6f77-4647-a4c2-9f834c55ee19-kube-api-access-btcd7\") pod \"ovn-operator-controller-manager-c968bb45-h4m5g\" (UID: \"8342c3c4-6f77-4647-a4c2-9f834c55ee19\") " pod="openstack-operators/ovn-operator-controller-manager-c968bb45-h4m5g"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.592708 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-c968bb45-h4m5g"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.620746 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-5qmws"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.635516 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05de66df-c97e-40d4-a605-188b3d8e66eb-cert\") pod \"openstack-operator-controller-manager-7cfc658b9-dlvb8\" (UID: \"05de66df-c97e-40d4-a605-188b3d8e66eb\") " pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-dlvb8"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.635563 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pppgx\" (UniqueName: \"kubernetes.io/projected/05de66df-c97e-40d4-a605-188b3d8e66eb-kube-api-access-pppgx\") pod \"openstack-operator-controller-manager-7cfc658b9-dlvb8\" (UID: \"05de66df-c97e-40d4-a605-188b3d8e66eb\") " pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-dlvb8"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.635598 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zglcw\" (UniqueName: \"kubernetes.io/projected/e3721fe9-cb88-4326-aa90-2d08b909515e-kube-api-access-zglcw\") pod \"telemetry-operator-controller-manager-f589c7597-c2mlw\" (UID: \"e3721fe9-cb88-4326-aa90-2d08b909515e\") " pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-c2mlw"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.635638 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkfvj\" (UniqueName: \"kubernetes.io/projected/dcb43cda-fbc6-4092-bf5d-296858e233cd-kube-api-access-bkfvj\") pod \"watcher-operator-controller-manager-5d98cc5575-mj4wz\" (UID: \"dcb43cda-fbc6-4092-bf5d-296858e233cd\") " pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-mj4wz"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.635667 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx752\" (UniqueName: \"kubernetes.io/projected/c921a4a9-e09a-4fd2-965e-c13f7fee169e-kube-api-access-hx752\") pod \"test-operator-controller-manager-6bb6dcddc-nsmkl\" (UID: \"c921a4a9-e09a-4fd2-965e-c13f7fee169e\") " pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-nsmkl"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.666188 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx752\" (UniqueName: \"kubernetes.io/projected/c921a4a9-e09a-4fd2-965e-c13f7fee169e-kube-api-access-hx752\") pod \"test-operator-controller-manager-6bb6dcddc-nsmkl\" (UID: \"c921a4a9-e09a-4fd2-965e-c13f7fee169e\") " pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-nsmkl"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.672903 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-76d5577b-9v85j"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.673809 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zglcw\" (UniqueName: \"kubernetes.io/projected/e3721fe9-cb88-4326-aa90-2d08b909515e-kube-api-access-zglcw\") pod \"telemetry-operator-controller-manager-f589c7597-c2mlw\" (UID: \"e3721fe9-cb88-4326-aa90-2d08b909515e\") " pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-c2mlw"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.716499 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-c2mlw"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.736395 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h66qm\" (UniqueName: \"kubernetes.io/projected/880d791a-bcb7-4f71-8a16-015bd26af4d9-kube-api-access-h66qm\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw\" (UID: \"880d791a-bcb7-4f71-8a16-015bd26af4d9\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.736457 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkfvj\" (UniqueName: \"kubernetes.io/projected/dcb43cda-fbc6-4092-bf5d-296858e233cd-kube-api-access-bkfvj\") pod \"watcher-operator-controller-manager-5d98cc5575-mj4wz\" (UID: \"dcb43cda-fbc6-4092-bf5d-296858e233cd\") " pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-mj4wz"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.736553 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05de66df-c97e-40d4-a605-188b3d8e66eb-cert\") pod \"openstack-operator-controller-manager-7cfc658b9-dlvb8\" (UID: \"05de66df-c97e-40d4-a605-188b3d8e66eb\") " pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-dlvb8"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.736572 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pppgx\" (UniqueName: \"kubernetes.io/projected/05de66df-c97e-40d4-a605-188b3d8e66eb-kube-api-access-pppgx\") pod \"openstack-operator-controller-manager-7cfc658b9-dlvb8\" (UID: \"05de66df-c97e-40d4-a605-188b3d8e66eb\") " pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-dlvb8"
Oct 06 06:58:22 crc kubenswrapper[4845]: E1006 06:58:22.736940 4845 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Oct 06 06:58:22 crc kubenswrapper[4845]: E1006 06:58:22.736977 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05de66df-c97e-40d4-a605-188b3d8e66eb-cert podName:05de66df-c97e-40d4-a605-188b3d8e66eb nodeName:}" failed. No retries permitted until 2025-10-06 06:58:23.236963701 +0000 UTC m=+787.751704709 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05de66df-c97e-40d4-a605-188b3d8e66eb-cert") pod "openstack-operator-controller-manager-7cfc658b9-dlvb8" (UID: "05de66df-c97e-40d4-a605-188b3d8e66eb") : secret "webhook-server-cert" not found
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.759725 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pppgx\" (UniqueName: \"kubernetes.io/projected/05de66df-c97e-40d4-a605-188b3d8e66eb-kube-api-access-pppgx\") pod \"openstack-operator-controller-manager-7cfc658b9-dlvb8\" (UID: \"05de66df-c97e-40d4-a605-188b3d8e66eb\") " pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-dlvb8"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.762575 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkfvj\" (UniqueName: \"kubernetes.io/projected/dcb43cda-fbc6-4092-bf5d-296858e233cd-kube-api-access-bkfvj\") pod \"watcher-operator-controller-manager-5d98cc5575-mj4wz\" (UID: \"dcb43cda-fbc6-4092-bf5d-296858e233cd\") " pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-mj4wz"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.786714 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-nsmkl"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.804328 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-84bd8f6848-mvrqc"]
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.837266 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h66qm\" (UniqueName: \"kubernetes.io/projected/880d791a-bcb7-4f71-8a16-015bd26af4d9-kube-api-access-h66qm\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw\" (UID: \"880d791a-bcb7-4f71-8a16-015bd26af4d9\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw"
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.900354 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h66qm\" (UniqueName: \"kubernetes.io/projected/880d791a-bcb7-4f71-8a16-015bd26af4d9-kube-api-access-h66qm\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw\" (UID: \"880d791a-bcb7-4f71-8a16-015bd26af4d9\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw"
Oct 06 06:58:22 crc kubenswrapper[4845]: W1006 06:58:22.904929 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4629df36_951c_461d_9b54_c69cbec8bcd5.slice/crio-a9953e1f73b8c81c7fa2cd5e9a998a772b001f252269ccc684ab08c95ebb6f75 WatchSource:0}: Error finding container a9953e1f73b8c81c7fa2cd5e9a998a772b001f252269ccc684ab08c95ebb6f75: Status 404 returned error can't find the container with id a9953e1f73b8c81c7fa2cd5e9a998a772b001f252269ccc684ab08c95ebb6f75
Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.932811 4845 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-mj4wz" Oct 06 06:58:22 crc kubenswrapper[4845]: I1006 06:58:22.938457 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1877f632-ca26-4045-a192-08d2f0f97a4e-cert\") pod \"openstack-baremetal-operator-controller-manager-66cc85b5d582qhb\" (UID: \"1877f632-ca26-4045-a192-08d2f0f97a4e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d582qhb" Oct 06 06:58:22 crc kubenswrapper[4845]: E1006 06:58:22.938619 4845 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 06 06:58:22 crc kubenswrapper[4845]: E1006 06:58:22.938660 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1877f632-ca26-4045-a192-08d2f0f97a4e-cert podName:1877f632-ca26-4045-a192-08d2f0f97a4e nodeName:}" failed. No retries permitted until 2025-10-06 06:58:23.938647847 +0000 UTC m=+788.453388855 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1877f632-ca26-4045-a192-08d2f0f97a4e-cert") pod "openstack-baremetal-operator-controller-manager-66cc85b5d582qhb" (UID: "1877f632-ca26-4045-a192-08d2f0f97a4e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.046127 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw" Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.233617 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-58d86cd59d-hf7mm"] Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.254908 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05de66df-c97e-40d4-a605-188b3d8e66eb-cert\") pod \"openstack-operator-controller-manager-7cfc658b9-dlvb8\" (UID: \"05de66df-c97e-40d4-a605-188b3d8e66eb\") " pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-dlvb8" Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.263147 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05de66df-c97e-40d4-a605-188b3d8e66eb-cert\") pod \"openstack-operator-controller-manager-7cfc658b9-dlvb8\" (UID: \"05de66df-c97e-40d4-a605-188b3d8e66eb\") " pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-dlvb8" Oct 06 06:58:23 crc kubenswrapper[4845]: W1006 06:58:23.280256 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a59590c_5261_403e_a7e3_0e726a025412.slice/crio-3729072c75e8879111547cb47e96f1889d0a456c8a2167be38a4591c9b0a3c94 WatchSource:0}: Error finding container 3729072c75e8879111547cb47e96f1889d0a456c8a2167be38a4591c9b0a3c94: Status 404 returned error can't find the container with id 3729072c75e8879111547cb47e96f1889d0a456c8a2167be38a4591c9b0a3c94 Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.293664 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f5894c49f-lx8hg"] Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.322527 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-dlvb8" Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.347807 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6675647785-4xx5n"] Oct 06 06:58:23 crc kubenswrapper[4845]: W1006 06:58:23.356908 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3eb4ea2_4738_4a6e_9eed_f41f7a616cdd.slice/crio-ed51273c53153a413b99f7e73f3e6b676ffc9d0d59d2bebb3ceb94f4eb37b5ac WatchSource:0}: Error finding container ed51273c53153a413b99f7e73f3e6b676ffc9d0d59d2bebb3ceb94f4eb37b5ac: Status 404 returned error can't find the container with id ed51273c53153a413b99f7e73f3e6b676ffc9d0d59d2bebb3ceb94f4eb37b5ac Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.375110 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5b974f6766-qzbsb"] Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.393754 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-84788b6bc5-s8lqw"] Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.401530 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6675647785-4xx5n" event={"ID":"c3eb4ea2-4738-4a6e-9eed-f41f7a616cdd","Type":"ContainerStarted","Data":"ed51273c53153a413b99f7e73f3e6b676ffc9d0d59d2bebb3ceb94f4eb37b5ac"} Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.403627 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-lx8hg" event={"ID":"01476e52-bab1-4b3b-b2d9-3a9d9469c943","Type":"ContainerStarted","Data":"bfae93c3779bd1b63248236f21b112f349e7cf62053d4bb4613a4bd62bf35ef0"} Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.406589 
4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-qzbsb" event={"ID":"2e4d8467-600b-4ae3-8b1a-c4f0416718f7","Type":"ContainerStarted","Data":"5c5b5104261503172a4dadd47894c3a3037c6a23c09defc3a490fa2bd61be2b9"} Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.414289 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-hf7mm" event={"ID":"0a59590c-5261-403e-a7e3-0e726a025412","Type":"ContainerStarted","Data":"3729072c75e8879111547cb47e96f1889d0a456c8a2167be38a4591c9b0a3c94"} Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.420833 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-mvrqc" event={"ID":"4629df36-951c-461d-9b54-c69cbec8bcd5","Type":"ContainerStarted","Data":"a9953e1f73b8c81c7fa2cd5e9a998a772b001f252269ccc684ab08c95ebb6f75"} Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.457550 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-698456cdc6-kdbk5"] Oct 06 06:58:23 crc kubenswrapper[4845]: W1006 06:58:23.468572 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod540c7899_a612_4745_8c42_02033d088f73.slice/crio-b315e4fdbb37264256c9cae06ecebde5925407746af30444303d556bddf913e0 WatchSource:0}: Error finding container b315e4fdbb37264256c9cae06ecebde5925407746af30444303d556bddf913e0: Status 404 returned error can't find the container with id b315e4fdbb37264256c9cae06ecebde5925407746af30444303d556bddf913e0 Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.481919 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5c497dbdb-nw58b"] Oct 06 06:58:23 crc kubenswrapper[4845]: W1006 06:58:23.491035 4845 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5166cd91_6a8c_4b81_b311_2a1e561928d3.slice/crio-4b7c2d064b0f95f8afc3eaefcfa3913bde2c21dce890c1d5686651b4b64be024 WatchSource:0}: Error finding container 4b7c2d064b0f95f8afc3eaefcfa3913bde2c21dce890c1d5686651b4b64be024: Status 404 returned error can't find the container with id 4b7c2d064b0f95f8afc3eaefcfa3913bde2c21dce890c1d5686651b4b64be024 Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.761574 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-76d5577b-9v85j"] Oct 06 06:58:23 crc kubenswrapper[4845]: W1006 06:58:23.773439 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38e8455f_f063_46aa_8275_20b6d80aa9ea.slice/crio-722bf69518e0cbe4b01ebcce3a23166aa3c8913b159acfe558192f00ee4f3ce6 WatchSource:0}: Error finding container 722bf69518e0cbe4b01ebcce3a23166aa3c8913b159acfe558192f00ee4f3ce6: Status 404 returned error can't find the container with id 722bf69518e0cbe4b01ebcce3a23166aa3c8913b159acfe558192f00ee4f3ce6 Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.777840 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f59f9d8-t66dv"] Oct 06 06:58:23 crc kubenswrapper[4845]: W1006 06:58:23.789750 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58567cef_7ff1_455c_a1d3_1f7a6f35a504.slice/crio-0e7d55deee7b165b08262abc8cfea080a18bbb20d1ac31419ab3aef421ab7314 WatchSource:0}: Error finding container 0e7d55deee7b165b08262abc8cfea080a18bbb20d1ac31419ab3aef421ab7314: Status 404 returned error can't find the container with id 0e7d55deee7b165b08262abc8cfea080a18bbb20d1ac31419ab3aef421ab7314 Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 
06:58:23.821166 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-57c9cdcf57-tf759"] Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.840763 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-wghng"] Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.849158 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-c968bb45-h4m5g"] Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.858396 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-f589c7597-c2mlw"] Oct 06 06:58:23 crc kubenswrapper[4845]: W1006 06:58:23.865432 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8342c3c4_6f77_4647_a4c2_9f834c55ee19.slice/crio-4f194c9443a4049b6e61201ad4456d7e6c967cf4e5838777397ab52abdb27856 WatchSource:0}: Error finding container 4f194c9443a4049b6e61201ad4456d7e6c967cf4e5838777397ab52abdb27856: Status 404 returned error can't find the container with id 4f194c9443a4049b6e61201ad4456d7e6c967cf4e5838777397ab52abdb27856 Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.872463 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-66f6d6849b-5qmws"] Oct 06 06:58:23 crc kubenswrapper[4845]: W1006 06:58:23.874906 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12269390_1665_4934_8695_eab596535e81.slice/crio-40473f52015c442331e522755518b450a02e31247c7cd37ad5b2e5ee3312f747 WatchSource:0}: Error finding container 40473f52015c442331e522755518b450a02e31247c7cd37ad5b2e5ee3312f747: Status 404 returned error can't find the container with id 
40473f52015c442331e522755518b450a02e31247c7cd37ad5b2e5ee3312f747 Oct 06 06:58:23 crc kubenswrapper[4845]: W1006 06:58:23.876390 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3721fe9_cb88_4326_aa90_2d08b909515e.slice/crio-34dc3a8600e7ab22e841878ad6fbe2e86ea0e70cd25738a89f59ab737e0c66ab WatchSource:0}: Error finding container 34dc3a8600e7ab22e841878ad6fbe2e86ea0e70cd25738a89f59ab737e0c66ab: Status 404 returned error can't find the container with id 34dc3a8600e7ab22e841878ad6fbe2e86ea0e70cd25738a89f59ab737e0c66ab Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.879189 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7cb48dbc-6vhjt"] Oct 06 06:58:23 crc kubenswrapper[4845]: E1006 06:58:23.889128 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zglcw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-f589c7597-c2mlw_openstack-operators(e3721fe9-cb88-4326-aa90-2d08b909515e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.893518 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-69b956fbf6-c4htj"] Oct 06 06:58:23 crc kubenswrapper[4845]: E1006 06:58:23.897701 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hx752,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-6bb6dcddc-nsmkl_openstack-operators(c921a4a9-e09a-4fd2-965e-c13f7fee169e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 06:58:23 crc kubenswrapper[4845]: E1006 06:58:23.898607 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m 
DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h66qm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw_openstack-operators(880d791a-bcb7-4f71-8a16-015bd26af4d9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 06:58:23 crc kubenswrapper[4845]: E1006 06:58:23.898701 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-422ql,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-69b956fbf6-c4htj_openstack-operators(49f6ba5b-4750-418f-ac64-574d92bf6f61): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 06:58:23 crc kubenswrapper[4845]: E1006 06:58:23.898737 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w9j76,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
nova-operator-controller-manager-6c9b57c67-wrcqz_openstack-operators(93be683f-25a1-477e-b676-5bc7be2c3bf8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 06:58:23 crc kubenswrapper[4845]: E1006 06:58:23.898835 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pz9lg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-66f6d6849b-5qmws_openstack-operators(c21520ff-b41e-4433-8656-d248f0975c60): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 06:58:23 crc kubenswrapper[4845]: E1006 06:58:23.899152 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bkfvj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5d98cc5575-mj4wz_openstack-operators(dcb43cda-fbc6-4092-bf5d-296858e233cd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 06:58:23 crc kubenswrapper[4845]: E1006 06:58:23.901039 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw" podUID="880d791a-bcb7-4f71-8a16-015bd26af4d9" Oct 06 06:58:23 crc 
kubenswrapper[4845]: I1006 06:58:23.904458 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-nsmkl"] Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.911671 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-6c9b57c67-wrcqz"] Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.923214 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw"] Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.931481 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5d98cc5575-mj4wz"] Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.942164 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7cfc658b9-dlvb8"] Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.947336 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lfwvb"] Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.952201 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lfwvb"] Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.952592 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lfwvb" Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.970120 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1877f632-ca26-4045-a192-08d2f0f97a4e-cert\") pod \"openstack-baremetal-operator-controller-manager-66cc85b5d582qhb\" (UID: \"1877f632-ca26-4045-a192-08d2f0f97a4e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d582qhb" Oct 06 06:58:23 crc kubenswrapper[4845]: I1006 06:58:23.977887 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1877f632-ca26-4045-a192-08d2f0f97a4e-cert\") pod \"openstack-baremetal-operator-controller-manager-66cc85b5d582qhb\" (UID: \"1877f632-ca26-4045-a192-08d2f0f97a4e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d582qhb" Oct 06 06:58:24 crc kubenswrapper[4845]: E1006 06:58:24.070778 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-c2mlw" podUID="e3721fe9-cb88-4326-aa90-2d08b909515e" Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.071011 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440a58e5-104c-4791-9cba-e62a0afbe4a2-catalog-content\") pod \"redhat-marketplace-lfwvb\" (UID: \"440a58e5-104c-4791-9cba-e62a0afbe4a2\") " pod="openshift-marketplace/redhat-marketplace-lfwvb" Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.071069 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9p62\" (UniqueName: 
\"kubernetes.io/projected/440a58e5-104c-4791-9cba-e62a0afbe4a2-kube-api-access-n9p62\") pod \"redhat-marketplace-lfwvb\" (UID: \"440a58e5-104c-4791-9cba-e62a0afbe4a2\") " pod="openshift-marketplace/redhat-marketplace-lfwvb" Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.071088 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440a58e5-104c-4791-9cba-e62a0afbe4a2-utilities\") pod \"redhat-marketplace-lfwvb\" (UID: \"440a58e5-104c-4791-9cba-e62a0afbe4a2\") " pod="openshift-marketplace/redhat-marketplace-lfwvb" Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.076256 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d582qhb" Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.172660 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440a58e5-104c-4791-9cba-e62a0afbe4a2-catalog-content\") pod \"redhat-marketplace-lfwvb\" (UID: \"440a58e5-104c-4791-9cba-e62a0afbe4a2\") " pod="openshift-marketplace/redhat-marketplace-lfwvb" Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.173158 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9p62\" (UniqueName: \"kubernetes.io/projected/440a58e5-104c-4791-9cba-e62a0afbe4a2-kube-api-access-n9p62\") pod \"redhat-marketplace-lfwvb\" (UID: \"440a58e5-104c-4791-9cba-e62a0afbe4a2\") " pod="openshift-marketplace/redhat-marketplace-lfwvb" Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.173177 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440a58e5-104c-4791-9cba-e62a0afbe4a2-utilities\") pod \"redhat-marketplace-lfwvb\" (UID: \"440a58e5-104c-4791-9cba-e62a0afbe4a2\") " 
pod="openshift-marketplace/redhat-marketplace-lfwvb" Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.173466 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440a58e5-104c-4791-9cba-e62a0afbe4a2-utilities\") pod \"redhat-marketplace-lfwvb\" (UID: \"440a58e5-104c-4791-9cba-e62a0afbe4a2\") " pod="openshift-marketplace/redhat-marketplace-lfwvb" Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.173099 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440a58e5-104c-4791-9cba-e62a0afbe4a2-catalog-content\") pod \"redhat-marketplace-lfwvb\" (UID: \"440a58e5-104c-4791-9cba-e62a0afbe4a2\") " pod="openshift-marketplace/redhat-marketplace-lfwvb" Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.197501 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9p62\" (UniqueName: \"kubernetes.io/projected/440a58e5-104c-4791-9cba-e62a0afbe4a2-kube-api-access-n9p62\") pod \"redhat-marketplace-lfwvb\" (UID: \"440a58e5-104c-4791-9cba-e62a0afbe4a2\") " pod="openshift-marketplace/redhat-marketplace-lfwvb" Oct 06 06:58:24 crc kubenswrapper[4845]: E1006 06:58:24.314057 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-mj4wz" podUID="dcb43cda-fbc6-4092-bf5d-296858e233cd" Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.320441 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lfwvb" Oct 06 06:58:24 crc kubenswrapper[4845]: E1006 06:58:24.336939 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-5qmws" podUID="c21520ff-b41e-4433-8656-d248f0975c60" Oct 06 06:58:24 crc kubenswrapper[4845]: E1006 06:58:24.348653 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-wrcqz" podUID="93be683f-25a1-477e-b676-5bc7be2c3bf8" Oct 06 06:58:24 crc kubenswrapper[4845]: E1006 06:58:24.358611 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-c4htj" podUID="49f6ba5b-4750-418f-ac64-574d92bf6f61" Oct 06 06:58:24 crc kubenswrapper[4845]: E1006 06:58:24.362705 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-nsmkl" podUID="c921a4a9-e09a-4fd2-965e-c13f7fee169e" Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.438296 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-wghng" event={"ID":"223d7355-4741-4d59-b6cf-e71702ddc20e","Type":"ContainerStarted","Data":"b49751077bd15572158cc3e87c3771a964f6d9f1aa3b2546625c31d47d2cab10"} Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.442735 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-dlvb8" 
event={"ID":"05de66df-c97e-40d4-a605-188b3d8e66eb","Type":"ContainerStarted","Data":"a5de92ab7efed6dded7bafe12a257aad0df3dbada2eb191b074b8d03228c3958"} Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.442780 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-dlvb8" event={"ID":"05de66df-c97e-40d4-a605-188b3d8e66eb","Type":"ContainerStarted","Data":"3849085e04ef7aca59039b36ea5a32f25ff97e555e0ddeb615d0124007562b67"} Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.457688 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-s8lqw" event={"ID":"3f91fed8-7759-443b-869a-886f63b42502","Type":"ContainerStarted","Data":"0ad7212a807666880a88b875a40517545129bffd93b0a4dca58b8b1dd021e770"} Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.461860 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-5qmws" event={"ID":"c21520ff-b41e-4433-8656-d248f0975c60","Type":"ContainerStarted","Data":"9d665b917f2f6be6f959c44f998ca0553ef9755be33b5a1e41657760b9b6b6c1"} Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.461913 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-5qmws" event={"ID":"c21520ff-b41e-4433-8656-d248f0975c60","Type":"ContainerStarted","Data":"01109b83c328aa1db3e07705230640e3546576985a3cd75f3ec5a5f8be13fd11"} Oct 06 06:58:24 crc kubenswrapper[4845]: E1006 06:58:24.464520 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1\\\"\"" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-5qmws" 
podUID="c21520ff-b41e-4433-8656-d248f0975c60" Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.466289 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-t66dv" event={"ID":"58567cef-7ff1-455c-a1d3-1f7a6f35a504","Type":"ContainerStarted","Data":"0e7d55deee7b165b08262abc8cfea080a18bbb20d1ac31419ab3aef421ab7314"} Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.473799 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-c968bb45-h4m5g" event={"ID":"8342c3c4-6f77-4647-a4c2-9f834c55ee19","Type":"ContainerStarted","Data":"4f194c9443a4049b6e61201ad4456d7e6c967cf4e5838777397ab52abdb27856"} Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.495581 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-c2mlw" event={"ID":"e3721fe9-cb88-4326-aa90-2d08b909515e","Type":"ContainerStarted","Data":"47da9ae934844e20ebb5dec9871434d683959845dc59c89bafd7dcce6dbc3293"} Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.495664 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-c2mlw" event={"ID":"e3721fe9-cb88-4326-aa90-2d08b909515e","Type":"ContainerStarted","Data":"34dc3a8600e7ab22e841878ad6fbe2e86ea0e70cd25738a89f59ab737e0c66ab"} Oct 06 06:58:24 crc kubenswrapper[4845]: E1006 06:58:24.502462 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-c2mlw" podUID="e3721fe9-cb88-4326-aa90-2d08b909515e" Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.520561 4845 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-698456cdc6-kdbk5" event={"ID":"540c7899-a612-4745-8c42-02033d088f73","Type":"ContainerStarted","Data":"b315e4fdbb37264256c9cae06ecebde5925407746af30444303d556bddf913e0"} Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.557699 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-wrcqz" event={"ID":"93be683f-25a1-477e-b676-5bc7be2c3bf8","Type":"ContainerStarted","Data":"52538b30bd70284dd04fc9253fd1db2b715ed361e4d678facb9c1b64edbc6f87"} Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.557763 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-wrcqz" event={"ID":"93be683f-25a1-477e-b676-5bc7be2c3bf8","Type":"ContainerStarted","Data":"e3ad604b92d76e36dd295b44628a7770ffde7516d7fd4ca81a064d3e815e2af1"} Oct 06 06:58:24 crc kubenswrapper[4845]: E1006 06:58:24.567535 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842\\\"\"" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-wrcqz" podUID="93be683f-25a1-477e-b676-5bc7be2c3bf8" Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.596598 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-mj4wz" event={"ID":"dcb43cda-fbc6-4092-bf5d-296858e233cd","Type":"ContainerStarted","Data":"7e647975e8edac8d118a64ec027113b5a8bc85e1ea754ad6066c9872259df666"} Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.596669 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-mj4wz" 
event={"ID":"dcb43cda-fbc6-4092-bf5d-296858e233cd","Type":"ContainerStarted","Data":"cf7680f272c9c6660fd7bfa5f999bd93a5313802b50c32a2ff4a0928a3276ebc"} Oct 06 06:58:24 crc kubenswrapper[4845]: E1006 06:58:24.615148 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-mj4wz" podUID="dcb43cda-fbc6-4092-bf5d-296858e233cd" Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.632037 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-76d5577b-9v85j" event={"ID":"38e8455f-f063-46aa-8275-20b6d80aa9ea","Type":"ContainerStarted","Data":"722bf69518e0cbe4b01ebcce3a23166aa3c8913b159acfe558192f00ee4f3ce6"} Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.646913 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-tf759" event={"ID":"09e0aaab-0038-4f9b-881e-3774781f2825","Type":"ContainerStarted","Data":"d42ed486f8e90e1246324960ba9b127e4e87c06608d8b3c023a268ddd8394b91"} Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.648150 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw" event={"ID":"880d791a-bcb7-4f71-8a16-015bd26af4d9","Type":"ContainerStarted","Data":"784086a2fcc39ad36978a3e8bcd3e24e39468f95a20b350309e4366711f7f407"} Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.661820 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d582qhb"] Oct 06 06:58:24 crc kubenswrapper[4845]: E1006 06:58:24.678516 4845 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw" podUID="880d791a-bcb7-4f71-8a16-015bd26af4d9" Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.681215 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-nw58b" event={"ID":"5166cd91-6a8c-4b81-b311-2a1e561928d3","Type":"ContainerStarted","Data":"4b7c2d064b0f95f8afc3eaefcfa3913bde2c21dce890c1d5686651b4b64be024"} Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.742055 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-nsmkl" event={"ID":"c921a4a9-e09a-4fd2-965e-c13f7fee169e","Type":"ContainerStarted","Data":"047a3b59a65ce5e164ffcdd5507c51ef5db87135b57bc06f0211f6664f56d231"} Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.742152 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-nsmkl" event={"ID":"c921a4a9-e09a-4fd2-965e-c13f7fee169e","Type":"ContainerStarted","Data":"c95ee28265c9c7e38a4f4ada8df4534e4c5532411ce8a783091853df00d230bc"} Oct 06 06:58:24 crc kubenswrapper[4845]: E1006 06:58:24.745944 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-nsmkl" podUID="c921a4a9-e09a-4fd2-965e-c13f7fee169e" Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.752517 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-6vhjt" event={"ID":"12269390-1665-4934-8695-eab596535e81","Type":"ContainerStarted","Data":"40473f52015c442331e522755518b450a02e31247c7cd37ad5b2e5ee3312f747"} Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.782024 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-c4htj" event={"ID":"49f6ba5b-4750-418f-ac64-574d92bf6f61","Type":"ContainerStarted","Data":"d53e4b6b1497924b4f8cae88ff029968702ebfaa3210d553cd8ba32680257b2f"} Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.783341 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-c4htj" event={"ID":"49f6ba5b-4750-418f-ac64-574d92bf6f61","Type":"ContainerStarted","Data":"92de209b8c606f62458bf364d386f68239b6193855e8aad869435a61c46fb6ce"} Oct 06 06:58:24 crc kubenswrapper[4845]: E1006 06:58:24.818528 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-c4htj" podUID="49f6ba5b-4750-418f-ac64-574d92bf6f61" Oct 06 06:58:24 crc kubenswrapper[4845]: I1006 06:58:24.906893 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lfwvb"] Oct 06 06:58:24 crc kubenswrapper[4845]: W1006 06:58:24.933212 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod440a58e5_104c_4791_9cba_e62a0afbe4a2.slice/crio-5f3d65b1f61b92de5ee13daef0a3c6d14de822d0f35aa6f46b7d008448f64fbc WatchSource:0}: Error finding container 5f3d65b1f61b92de5ee13daef0a3c6d14de822d0f35aa6f46b7d008448f64fbc: Status 404 
returned error can't find the container with id 5f3d65b1f61b92de5ee13daef0a3c6d14de822d0f35aa6f46b7d008448f64fbc Oct 06 06:58:25 crc kubenswrapper[4845]: E1006 06:58:25.360203 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod440a58e5_104c_4791_9cba_e62a0afbe4a2.slice/crio-conmon-48a55ba6250bd8daf8eb46bd619f6fadb520fe62add300a9dac3944826359b71.scope\": RecentStats: unable to find data in memory cache]" Oct 06 06:58:25 crc kubenswrapper[4845]: I1006 06:58:25.797227 4845 generic.go:334] "Generic (PLEG): container finished" podID="440a58e5-104c-4791-9cba-e62a0afbe4a2" containerID="48a55ba6250bd8daf8eb46bd619f6fadb520fe62add300a9dac3944826359b71" exitCode=0 Oct 06 06:58:25 crc kubenswrapper[4845]: I1006 06:58:25.797330 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lfwvb" event={"ID":"440a58e5-104c-4791-9cba-e62a0afbe4a2","Type":"ContainerDied","Data":"48a55ba6250bd8daf8eb46bd619f6fadb520fe62add300a9dac3944826359b71"} Oct 06 06:58:25 crc kubenswrapper[4845]: I1006 06:58:25.797424 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lfwvb" event={"ID":"440a58e5-104c-4791-9cba-e62a0afbe4a2","Type":"ContainerStarted","Data":"5f3d65b1f61b92de5ee13daef0a3c6d14de822d0f35aa6f46b7d008448f64fbc"} Oct 06 06:58:25 crc kubenswrapper[4845]: I1006 06:58:25.798792 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d582qhb" event={"ID":"1877f632-ca26-4045-a192-08d2f0f97a4e","Type":"ContainerStarted","Data":"1d5aa208342167d270290fd0ff439d02d6a8d7dcc2fd82c0ee5e0945ae289a8d"} Oct 06 06:58:25 crc kubenswrapper[4845]: I1006 06:58:25.806040 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-dlvb8" 
event={"ID":"05de66df-c97e-40d4-a605-188b3d8e66eb","Type":"ContainerStarted","Data":"635216d08a06a5350ec7264d5edc1d0bb7ac88309d00ccd4057a6719dd8f74d3"} Oct 06 06:58:25 crc kubenswrapper[4845]: E1006 06:58:25.806970 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1\\\"\"" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-5qmws" podUID="c21520ff-b41e-4433-8656-d248f0975c60" Oct 06 06:58:25 crc kubenswrapper[4845]: I1006 06:58:25.807037 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-dlvb8" Oct 06 06:58:25 crc kubenswrapper[4845]: E1006 06:58:25.807489 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-c2mlw" podUID="e3721fe9-cb88-4326-aa90-2d08b909515e" Oct 06 06:58:25 crc kubenswrapper[4845]: E1006 06:58:25.807821 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw" podUID="880d791a-bcb7-4f71-8a16-015bd26af4d9" Oct 06 06:58:25 crc kubenswrapper[4845]: E1006 06:58:25.808613 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-mj4wz" podUID="dcb43cda-fbc6-4092-bf5d-296858e233cd" Oct 06 06:58:25 crc kubenswrapper[4845]: E1006 06:58:25.809031 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842\\\"\"" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-wrcqz" podUID="93be683f-25a1-477e-b676-5bc7be2c3bf8" Oct 06 06:58:25 crc kubenswrapper[4845]: E1006 06:58:25.809163 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-c4htj" podUID="49f6ba5b-4750-418f-ac64-574d92bf6f61" Oct 06 06:58:25 crc kubenswrapper[4845]: E1006 06:58:25.809771 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-nsmkl" podUID="c921a4a9-e09a-4fd2-965e-c13f7fee169e" Oct 06 06:58:25 crc kubenswrapper[4845]: I1006 06:58:25.876769 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-dlvb8" podStartSLOduration=3.87674926 podStartE2EDuration="3.87674926s" 
podCreationTimestamp="2025-10-06 06:58:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 06:58:25.872532034 +0000 UTC m=+790.387273052" watchObservedRunningTime="2025-10-06 06:58:25.87674926 +0000 UTC m=+790.391490268" Oct 06 06:58:28 crc kubenswrapper[4845]: I1006 06:58:28.342063 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lnl58"] Oct 06 06:58:28 crc kubenswrapper[4845]: I1006 06:58:28.351009 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lnl58"] Oct 06 06:58:28 crc kubenswrapper[4845]: I1006 06:58:28.351169 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lnl58" Oct 06 06:58:28 crc kubenswrapper[4845]: I1006 06:58:28.456927 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f-utilities\") pod \"community-operators-lnl58\" (UID: \"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f\") " pod="openshift-marketplace/community-operators-lnl58" Oct 06 06:58:28 crc kubenswrapper[4845]: I1006 06:58:28.456977 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f-catalog-content\") pod \"community-operators-lnl58\" (UID: \"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f\") " pod="openshift-marketplace/community-operators-lnl58" Oct 06 06:58:28 crc kubenswrapper[4845]: I1006 06:58:28.457024 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm5fk\" (UniqueName: \"kubernetes.io/projected/5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f-kube-api-access-rm5fk\") pod \"community-operators-lnl58\" 
(UID: \"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f\") " pod="openshift-marketplace/community-operators-lnl58" Oct 06 06:58:28 crc kubenswrapper[4845]: I1006 06:58:28.559140 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f-utilities\") pod \"community-operators-lnl58\" (UID: \"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f\") " pod="openshift-marketplace/community-operators-lnl58" Oct 06 06:58:28 crc kubenswrapper[4845]: I1006 06:58:28.559205 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f-catalog-content\") pod \"community-operators-lnl58\" (UID: \"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f\") " pod="openshift-marketplace/community-operators-lnl58" Oct 06 06:58:28 crc kubenswrapper[4845]: I1006 06:58:28.559262 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm5fk\" (UniqueName: \"kubernetes.io/projected/5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f-kube-api-access-rm5fk\") pod \"community-operators-lnl58\" (UID: \"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f\") " pod="openshift-marketplace/community-operators-lnl58" Oct 06 06:58:28 crc kubenswrapper[4845]: I1006 06:58:28.560265 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f-catalog-content\") pod \"community-operators-lnl58\" (UID: \"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f\") " pod="openshift-marketplace/community-operators-lnl58" Oct 06 06:58:28 crc kubenswrapper[4845]: I1006 06:58:28.560363 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f-utilities\") pod \"community-operators-lnl58\" (UID: 
\"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f\") " pod="openshift-marketplace/community-operators-lnl58" Oct 06 06:58:28 crc kubenswrapper[4845]: I1006 06:58:28.587739 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm5fk\" (UniqueName: \"kubernetes.io/projected/5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f-kube-api-access-rm5fk\") pod \"community-operators-lnl58\" (UID: \"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f\") " pod="openshift-marketplace/community-operators-lnl58" Oct 06 06:58:28 crc kubenswrapper[4845]: I1006 06:58:28.677315 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lnl58" Oct 06 06:58:33 crc kubenswrapper[4845]: I1006 06:58:33.198234 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lnl58"] Oct 06 06:58:33 crc kubenswrapper[4845]: I1006 06:58:33.333566 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-dlvb8" Oct 06 06:58:33 crc kubenswrapper[4845]: I1006 06:58:33.882152 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-lx8hg" event={"ID":"01476e52-bab1-4b3b-b2d9-3a9d9469c943","Type":"ContainerStarted","Data":"aac002af1d51ed2cefe423ba905cca3ee7a07b5f3c688b016adb41c0eedb8dfa"} Oct 06 06:58:33 crc kubenswrapper[4845]: I1006 06:58:33.889951 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-mvrqc" event={"ID":"4629df36-951c-461d-9b54-c69cbec8bcd5","Type":"ContainerStarted","Data":"61f04961e0c69777e23551a54c2e5fa8a5e01f0b84d233ffc737f82b56ce8b07"} Oct 06 06:58:33 crc kubenswrapper[4845]: I1006 06:58:33.898274 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d582qhb" 
event={"ID":"1877f632-ca26-4045-a192-08d2f0f97a4e","Type":"ContainerStarted","Data":"e52935e7a8b44d817e9d243d8132f3c7854ef762d3e00256db24394e184d8b04"} Oct 06 06:58:33 crc kubenswrapper[4845]: I1006 06:58:33.900387 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-76d5577b-9v85j" event={"ID":"38e8455f-f063-46aa-8275-20b6d80aa9ea","Type":"ContainerStarted","Data":"365098ceb736d2f8500bc5cd15dae3b7e1aad0929b2440162750c90281e6d662"} Oct 06 06:58:33 crc kubenswrapper[4845]: I1006 06:58:33.904248 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnl58" event={"ID":"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f","Type":"ContainerStarted","Data":"39d4426e45077972673373670255b6446f1252bf3dfc38b42812a2903e343a51"} Oct 06 06:58:33 crc kubenswrapper[4845]: I1006 06:58:33.920187 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-s8lqw" event={"ID":"3f91fed8-7759-443b-869a-886f63b42502","Type":"ContainerStarted","Data":"5f53631d47d4d32817c545da007971c03b2bac876a199bd971bfb25c3af15211"} Oct 06 06:58:33 crc kubenswrapper[4845]: I1006 06:58:33.957745 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-qzbsb" event={"ID":"2e4d8467-600b-4ae3-8b1a-c4f0416718f7","Type":"ContainerStarted","Data":"edbefbc91fb12368822111cb43bd8ebe266dabdfec94ce94684c81744677d36e"} Oct 06 06:58:33 crc kubenswrapper[4845]: I1006 06:58:33.977745 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-nw58b" event={"ID":"5166cd91-6a8c-4b81-b311-2a1e561928d3","Type":"ContainerStarted","Data":"e59af9f97b924fc90c11c0f7a8d3f5b9fab8cfad3df8056af90fec197148137b"} Oct 06 06:58:33 crc kubenswrapper[4845]: I1006 06:58:33.980362 4845 generic.go:334] "Generic (PLEG): container finished" 
podID="440a58e5-104c-4791-9cba-e62a0afbe4a2" containerID="9524fc40a392690a7848b807187e15e0d57777c0cf6f7ed68c8035bfa951b817" exitCode=0 Oct 06 06:58:33 crc kubenswrapper[4845]: I1006 06:58:33.980458 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lfwvb" event={"ID":"440a58e5-104c-4791-9cba-e62a0afbe4a2","Type":"ContainerDied","Data":"9524fc40a392690a7848b807187e15e0d57777c0cf6f7ed68c8035bfa951b817"} Oct 06 06:58:33 crc kubenswrapper[4845]: I1006 06:58:33.990628 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-698456cdc6-kdbk5" event={"ID":"540c7899-a612-4745-8c42-02033d088f73","Type":"ContainerStarted","Data":"8fc2ece8b2ba01a80407cad106fa5d9976a98dd3c6c859bbd1ca0b927c6a7452"} Oct 06 06:58:33 crc kubenswrapper[4845]: I1006 06:58:33.991235 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-698456cdc6-kdbk5" Oct 06 06:58:34 crc kubenswrapper[4845]: I1006 06:58:34.013676 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-6vhjt" event={"ID":"12269390-1665-4934-8695-eab596535e81","Type":"ContainerStarted","Data":"40bc9f6690cf473bb3948b93ce5d4a156fa8caf99941ddcde3f8117940e7f03f"} Oct 06 06:58:34 crc kubenswrapper[4845]: I1006 06:58:34.020506 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-wghng" event={"ID":"223d7355-4741-4d59-b6cf-e71702ddc20e","Type":"ContainerStarted","Data":"1623833618f94d8fa1f45424abb32435f233ca970b8f8f9f72b7f1d4a0a13583"} Oct 06 06:58:34 crc kubenswrapper[4845]: I1006 06:58:34.021746 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-tf759" 
event={"ID":"09e0aaab-0038-4f9b-881e-3774781f2825","Type":"ContainerStarted","Data":"8bd3867f49711ce03b8e80829bbe1369aa76a638759da1e1ff66e7e0d32f69f3"} Oct 06 06:58:34 crc kubenswrapper[4845]: I1006 06:58:34.023512 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-c968bb45-h4m5g" event={"ID":"8342c3c4-6f77-4647-a4c2-9f834c55ee19","Type":"ContainerStarted","Data":"4bfc04c67d214690302857ae8bad8d813a81d1e026d764b2619a7f56acdcf5e3"} Oct 06 06:58:34 crc kubenswrapper[4845]: I1006 06:58:34.025051 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-hf7mm" event={"ID":"0a59590c-5261-403e-a7e3-0e726a025412","Type":"ContainerStarted","Data":"61d3aecaa81e573b2d1bbf17b6e98bcba85056d33c6f505e51ef7b42de34e5ae"} Oct 06 06:58:34 crc kubenswrapper[4845]: I1006 06:58:34.989877 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-698456cdc6-kdbk5" podStartSLOduration=4.717981983 podStartE2EDuration="13.989860968s" podCreationTimestamp="2025-10-06 06:58:21 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.476668309 +0000 UTC m=+787.991409317" lastFinishedPulling="2025-10-06 06:58:32.748547294 +0000 UTC m=+797.263288302" observedRunningTime="2025-10-06 06:58:34.061345197 +0000 UTC m=+798.576086205" watchObservedRunningTime="2025-10-06 06:58:34.989860968 +0000 UTC m=+799.504601976" Oct 06 06:58:34 crc kubenswrapper[4845]: I1006 06:58:34.990342 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n7qkr"] Oct 06 06:58:34 crc kubenswrapper[4845]: I1006 06:58:34.991645 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n7qkr" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.004608 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n7qkr"] Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.033916 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-lx8hg" event={"ID":"01476e52-bab1-4b3b-b2d9-3a9d9469c943","Type":"ContainerStarted","Data":"57e81ab690d833aeff2a2cee0dc7df5cd290dc4592508196fa844e19e302295a"} Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.034048 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-lx8hg" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.035982 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d582qhb" event={"ID":"1877f632-ca26-4045-a192-08d2f0f97a4e","Type":"ContainerStarted","Data":"5fd8b3716bfc912bd3c2752f478df7a2471e5925cc73d7761138c30495f62fe1"} Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.036085 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d582qhb" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.037498 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-76d5577b-9v85j" event={"ID":"38e8455f-f063-46aa-8275-20b6d80aa9ea","Type":"ContainerStarted","Data":"6391def01f90abebe745f1fa1bcb57faa8f0197c401fb41e255946336ceebd58"} Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.037621 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-76d5577b-9v85j" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 
06:58:35.039055 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6675647785-4xx5n" event={"ID":"c3eb4ea2-4738-4a6e-9eed-f41f7a616cdd","Type":"ContainerStarted","Data":"ea9c011a67ee0a62a93310c9a3f71becbe93bbebbea72d849223e4c3cbf437f1"} Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.039084 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6675647785-4xx5n" event={"ID":"c3eb4ea2-4738-4a6e-9eed-f41f7a616cdd","Type":"ContainerStarted","Data":"9cecbe5b4a35393497721584c605de82c0fefc783c1ffc97ec834a0aa78de9b7"} Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.039199 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6675647785-4xx5n" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.040389 4845 generic.go:334] "Generic (PLEG): container finished" podID="5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f" containerID="bf6334b78585957e6df826d9b7da57137e2bf56e53285dc55eb9436cabfe726d" exitCode=0 Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.040415 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnl58" event={"ID":"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f","Type":"ContainerDied","Data":"bf6334b78585957e6df826d9b7da57137e2bf56e53285dc55eb9436cabfe726d"} Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.042857 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-s8lqw" event={"ID":"3f91fed8-7759-443b-869a-886f63b42502","Type":"ContainerStarted","Data":"634e1c8d3bffb6ead49ac765eb20b6211018390ea25a9c0e732c973e3888f456"} Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.042984 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-s8lqw" Oct 
06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.044434 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-698456cdc6-kdbk5" event={"ID":"540c7899-a612-4745-8c42-02033d088f73","Type":"ContainerStarted","Data":"ca7b19821f25fe12717fefb282d4166445b4e38bfa26fcd4696ae7b9abe2ec96"} Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.045807 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-wghng" event={"ID":"223d7355-4741-4d59-b6cf-e71702ddc20e","Type":"ContainerStarted","Data":"e0569bf48921ec28c59738db543163245027c3687db8dd9d1ff074662e1b8f22"} Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.046323 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-wghng" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.052587 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-mvrqc" event={"ID":"4629df36-951c-461d-9b54-c69cbec8bcd5","Type":"ContainerStarted","Data":"5c5e7b359a77f7a93f0483510cb9b721fa34dfcfcd80affffb9a94567ad8f1f2"} Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.052684 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-mvrqc" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.062098 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-tf759" event={"ID":"09e0aaab-0038-4f9b-881e-3774781f2825","Type":"ContainerStarted","Data":"4a411cd9319f481a48fbf27a850101442529cd4c342b609f9e77d422e5b08917"} Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.062919 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-tf759" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.066453 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-lx8hg" podStartSLOduration=4.672739914 podStartE2EDuration="14.066437225s" podCreationTimestamp="2025-10-06 06:58:21 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.310329312 +0000 UTC m=+787.825070320" lastFinishedPulling="2025-10-06 06:58:32.704026623 +0000 UTC m=+797.218767631" observedRunningTime="2025-10-06 06:58:35.062548847 +0000 UTC m=+799.577289855" watchObservedRunningTime="2025-10-06 06:58:35.066437225 +0000 UTC m=+799.581178233" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.067682 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lfwvb" event={"ID":"440a58e5-104c-4791-9cba-e62a0afbe4a2","Type":"ContainerStarted","Data":"511d2440745a81566bd04eee3c63dc3ab3ceac9f0675178ce2423681f7ff5e08"} Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.074112 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e58f10-5cf5-4a79-8303-bf14cd55f546-utilities\") pod \"certified-operators-n7qkr\" (UID: \"e2e58f10-5cf5-4a79-8303-bf14cd55f546\") " pod="openshift-marketplace/certified-operators-n7qkr" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.074155 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e58f10-5cf5-4a79-8303-bf14cd55f546-catalog-content\") pod \"certified-operators-n7qkr\" (UID: \"e2e58f10-5cf5-4a79-8303-bf14cd55f546\") " pod="openshift-marketplace/certified-operators-n7qkr" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.074184 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb4g9\" (UniqueName: \"kubernetes.io/projected/e2e58f10-5cf5-4a79-8303-bf14cd55f546-kube-api-access-wb4g9\") pod \"certified-operators-n7qkr\" (UID: \"e2e58f10-5cf5-4a79-8303-bf14cd55f546\") " pod="openshift-marketplace/certified-operators-n7qkr" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.077652 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-6vhjt" event={"ID":"12269390-1665-4934-8695-eab596535e81","Type":"ContainerStarted","Data":"c0f4652e675a8b2b60b55e2053755df329f0fceea294f9667d17a256740cc7e6"} Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.078160 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-6vhjt" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.085687 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-qzbsb" event={"ID":"2e4d8467-600b-4ae3-8b1a-c4f0416718f7","Type":"ContainerStarted","Data":"a0c8fb0268e9eb0c6cdc6c97494b64ef4e7d8ae1e8e1730a96459a980ca6336a"} Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.086450 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-qzbsb" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.089539 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-tf759" podStartSLOduration=5.229045826 podStartE2EDuration="14.089523666s" podCreationTimestamp="2025-10-06 06:58:21 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.843444751 +0000 UTC m=+788.358185759" lastFinishedPulling="2025-10-06 06:58:32.703922571 +0000 UTC m=+797.218663599" observedRunningTime="2025-10-06 06:58:35.086765047 
+0000 UTC m=+799.601506055" watchObservedRunningTime="2025-10-06 06:58:35.089523666 +0000 UTC m=+799.604264684" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.095754 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-c968bb45-h4m5g" event={"ID":"8342c3c4-6f77-4647-a4c2-9f834c55ee19","Type":"ContainerStarted","Data":"c4f9a78442fe991f09ed28cd9f1c50e068c7b10c82a064ad8dde02c2f01b5c94"} Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.096401 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-c968bb45-h4m5g" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.102668 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-nw58b" event={"ID":"5166cd91-6a8c-4b81-b311-2a1e561928d3","Type":"ContainerStarted","Data":"ebb816bcf0b00ec636b3979ab0d438e95e9db9db1b58add0e6a38cc807a72cec"} Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.103458 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-nw58b" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.115544 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-hf7mm" event={"ID":"0a59590c-5261-403e-a7e3-0e726a025412","Type":"ContainerStarted","Data":"4aae3ee1bc349c4176bbc5ca790e2466fd22f54347166a081200e53d762da726"} Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.115991 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-hf7mm" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.125552 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-t66dv" 
event={"ID":"58567cef-7ff1-455c-a1d3-1f7a6f35a504","Type":"ContainerStarted","Data":"ef47caed5653c282c36de77ea4b32afe0e48ffbdaaa05d3793f6551222bb7d4b"} Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.125590 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-t66dv" event={"ID":"58567cef-7ff1-455c-a1d3-1f7a6f35a504","Type":"ContainerStarted","Data":"c15d200387daf617773ee3250195129fccdc9464a96486aa937876480a291695"} Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.126265 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-t66dv" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.135160 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-mvrqc" podStartSLOduration=4.350234056 podStartE2EDuration="14.135143424s" podCreationTimestamp="2025-10-06 06:58:21 +0000 UTC" firstStartedPulling="2025-10-06 06:58:22.963715978 +0000 UTC m=+787.478456986" lastFinishedPulling="2025-10-06 06:58:32.748625346 +0000 UTC m=+797.263366354" observedRunningTime="2025-10-06 06:58:35.132685373 +0000 UTC m=+799.647426371" watchObservedRunningTime="2025-10-06 06:58:35.135143424 +0000 UTC m=+799.649884432" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.136230 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6675647785-4xx5n" podStartSLOduration=4.725086632 podStartE2EDuration="14.136222182s" podCreationTimestamp="2025-10-06 06:58:21 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.364016724 +0000 UTC m=+787.878757732" lastFinishedPulling="2025-10-06 06:58:32.775152274 +0000 UTC m=+797.289893282" observedRunningTime="2025-10-06 06:58:35.110957346 +0000 UTC m=+799.625698364" watchObservedRunningTime="2025-10-06 06:58:35.136222182 
+0000 UTC m=+799.650963190" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.149570 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-wghng" podStartSLOduration=5.263120974 podStartE2EDuration="14.149557727s" podCreationTimestamp="2025-10-06 06:58:21 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.865451605 +0000 UTC m=+788.380192613" lastFinishedPulling="2025-10-06 06:58:32.751888348 +0000 UTC m=+797.266629366" observedRunningTime="2025-10-06 06:58:35.148107101 +0000 UTC m=+799.662848109" watchObservedRunningTime="2025-10-06 06:58:35.149557727 +0000 UTC m=+799.664298725" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.174022 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d582qhb" podStartSLOduration=5.148714506 podStartE2EDuration="13.174006363s" podCreationTimestamp="2025-10-06 06:58:22 +0000 UTC" firstStartedPulling="2025-10-06 06:58:24.781158244 +0000 UTC m=+789.295899252" lastFinishedPulling="2025-10-06 06:58:32.806450101 +0000 UTC m=+797.321191109" observedRunningTime="2025-10-06 06:58:35.16833583 +0000 UTC m=+799.683076858" watchObservedRunningTime="2025-10-06 06:58:35.174006363 +0000 UTC m=+799.688747361" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.175751 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e58f10-5cf5-4a79-8303-bf14cd55f546-utilities\") pod \"certified-operators-n7qkr\" (UID: \"e2e58f10-5cf5-4a79-8303-bf14cd55f546\") " pod="openshift-marketplace/certified-operators-n7qkr" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.175808 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e58f10-5cf5-4a79-8303-bf14cd55f546-catalog-content\") pod 
\"certified-operators-n7qkr\" (UID: \"e2e58f10-5cf5-4a79-8303-bf14cd55f546\") " pod="openshift-marketplace/certified-operators-n7qkr" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.175835 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb4g9\" (UniqueName: \"kubernetes.io/projected/e2e58f10-5cf5-4a79-8303-bf14cd55f546-kube-api-access-wb4g9\") pod \"certified-operators-n7qkr\" (UID: \"e2e58f10-5cf5-4a79-8303-bf14cd55f546\") " pod="openshift-marketplace/certified-operators-n7qkr" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.176821 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e58f10-5cf5-4a79-8303-bf14cd55f546-utilities\") pod \"certified-operators-n7qkr\" (UID: \"e2e58f10-5cf5-4a79-8303-bf14cd55f546\") " pod="openshift-marketplace/certified-operators-n7qkr" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.177047 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e58f10-5cf5-4a79-8303-bf14cd55f546-catalog-content\") pod \"certified-operators-n7qkr\" (UID: \"e2e58f10-5cf5-4a79-8303-bf14cd55f546\") " pod="openshift-marketplace/certified-operators-n7qkr" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.221405 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb4g9\" (UniqueName: \"kubernetes.io/projected/e2e58f10-5cf5-4a79-8303-bf14cd55f546-kube-api-access-wb4g9\") pod \"certified-operators-n7qkr\" (UID: \"e2e58f10-5cf5-4a79-8303-bf14cd55f546\") " pod="openshift-marketplace/certified-operators-n7qkr" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.244148 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-s8lqw" podStartSLOduration=4.895624885 podStartE2EDuration="14.244129968s" 
podCreationTimestamp="2025-10-06 06:58:21 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.401125078 +0000 UTC m=+787.915866086" lastFinishedPulling="2025-10-06 06:58:32.749630161 +0000 UTC m=+797.264371169" observedRunningTime="2025-10-06 06:58:35.212662546 +0000 UTC m=+799.727403554" watchObservedRunningTime="2025-10-06 06:58:35.244129968 +0000 UTC m=+799.758870986" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.248844 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-76d5577b-9v85j" podStartSLOduration=4.27726096 podStartE2EDuration="13.248827566s" podCreationTimestamp="2025-10-06 06:58:22 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.777501341 +0000 UTC m=+788.292242369" lastFinishedPulling="2025-10-06 06:58:32.749067967 +0000 UTC m=+797.263808975" observedRunningTime="2025-10-06 06:58:35.242757263 +0000 UTC m=+799.757498271" watchObservedRunningTime="2025-10-06 06:58:35.248827566 +0000 UTC m=+799.763568574" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.258994 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-hf7mm" podStartSLOduration=4.799121065 podStartE2EDuration="14.258976301s" podCreationTimestamp="2025-10-06 06:58:21 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.28638773 +0000 UTC m=+787.801128738" lastFinishedPulling="2025-10-06 06:58:32.746242966 +0000 UTC m=+797.260983974" observedRunningTime="2025-10-06 06:58:35.258977811 +0000 UTC m=+799.773718819" watchObservedRunningTime="2025-10-06 06:58:35.258976301 +0000 UTC m=+799.773717309" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.301358 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-6vhjt" podStartSLOduration=5.44173303 podStartE2EDuration="14.301339498s" podCreationTimestamp="2025-10-06 
06:58:21 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.888901285 +0000 UTC m=+788.403642293" lastFinishedPulling="2025-10-06 06:58:32.748507743 +0000 UTC m=+797.263248761" observedRunningTime="2025-10-06 06:58:35.287079059 +0000 UTC m=+799.801820087" watchObservedRunningTime="2025-10-06 06:58:35.301339498 +0000 UTC m=+799.816080536" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.311655 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n7qkr" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.320553 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-qzbsb" podStartSLOduration=4.967862563 podStartE2EDuration="14.320537581s" podCreationTimestamp="2025-10-06 06:58:21 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.395910207 +0000 UTC m=+787.910651215" lastFinishedPulling="2025-10-06 06:58:32.748585215 +0000 UTC m=+797.263326233" observedRunningTime="2025-10-06 06:58:35.316567041 +0000 UTC m=+799.831308049" watchObservedRunningTime="2025-10-06 06:58:35.320537581 +0000 UTC m=+799.835278599" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.348275 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-t66dv" podStartSLOduration=5.397934468 podStartE2EDuration="14.348258089s" podCreationTimestamp="2025-10-06 06:58:21 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.795873074 +0000 UTC m=+788.310614072" lastFinishedPulling="2025-10-06 06:58:32.746196665 +0000 UTC m=+797.260937693" observedRunningTime="2025-10-06 06:58:35.336845751 +0000 UTC m=+799.851586759" watchObservedRunningTime="2025-10-06 06:58:35.348258089 +0000 UTC m=+799.862999097" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.364810 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ovn-operator-controller-manager-c968bb45-h4m5g" podStartSLOduration=4.488665422 podStartE2EDuration="13.364792425s" podCreationTimestamp="2025-10-06 06:58:22 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.875757305 +0000 UTC m=+788.390498313" lastFinishedPulling="2025-10-06 06:58:32.751884298 +0000 UTC m=+797.266625316" observedRunningTime="2025-10-06 06:58:35.361591544 +0000 UTC m=+799.876332552" watchObservedRunningTime="2025-10-06 06:58:35.364792425 +0000 UTC m=+799.879533433" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.403906 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lfwvb" podStartSLOduration=4.466265138 podStartE2EDuration="12.403887889s" podCreationTimestamp="2025-10-06 06:58:23 +0000 UTC" firstStartedPulling="2025-10-06 06:58:26.608337884 +0000 UTC m=+791.123078882" lastFinishedPulling="2025-10-06 06:58:34.545960625 +0000 UTC m=+799.060701633" observedRunningTime="2025-10-06 06:58:35.38050263 +0000 UTC m=+799.895243658" watchObservedRunningTime="2025-10-06 06:58:35.403887889 +0000 UTC m=+799.918628887" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.616193 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-nw58b" podStartSLOduration=5.360357602 podStartE2EDuration="14.616177262s" podCreationTimestamp="2025-10-06 06:58:21 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.494088198 +0000 UTC m=+788.008829206" lastFinishedPulling="2025-10-06 06:58:32.749907848 +0000 UTC m=+797.264648866" observedRunningTime="2025-10-06 06:58:35.403600142 +0000 UTC m=+799.918341160" watchObservedRunningTime="2025-10-06 06:58:35.616177262 +0000 UTC m=+800.130918270" Oct 06 06:58:35 crc kubenswrapper[4845]: I1006 06:58:35.620986 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n7qkr"] Oct 06 06:58:36 crc 
kubenswrapper[4845]: I1006 06:58:36.136297 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnl58" event={"ID":"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f","Type":"ContainerStarted","Data":"19508e4e6b6f63cdf55048175354649ba7b7636716884ebe6ec56c1dbfbb6e98"}
Oct 06 06:58:36 crc kubenswrapper[4845]: I1006 06:58:36.138755 4845 generic.go:334] "Generic (PLEG): container finished" podID="e2e58f10-5cf5-4a79-8303-bf14cd55f546" containerID="c31b9d406a577e180c40e450d67000ccf86f7dd9220e9847dcde9e6492cbec9c" exitCode=0
Oct 06 06:58:36 crc kubenswrapper[4845]: I1006 06:58:36.138917 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7qkr" event={"ID":"e2e58f10-5cf5-4a79-8303-bf14cd55f546","Type":"ContainerDied","Data":"c31b9d406a577e180c40e450d67000ccf86f7dd9220e9847dcde9e6492cbec9c"}
Oct 06 06:58:36 crc kubenswrapper[4845]: I1006 06:58:36.138973 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7qkr" event={"ID":"e2e58f10-5cf5-4a79-8303-bf14cd55f546","Type":"ContainerStarted","Data":"23d47843e8607317c6c72db60670d40bc3d5b5069807aa6d1d53d37504719b2c"}
Oct 06 06:58:37 crc kubenswrapper[4845]: I1006 06:58:37.149947 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7qkr" event={"ID":"e2e58f10-5cf5-4a79-8303-bf14cd55f546","Type":"ContainerStarted","Data":"ea91221c4f94174e372a4e0c18262b3f89531787b56d4dafb4af5b01905079fc"}
Oct 06 06:58:37 crc kubenswrapper[4845]: I1006 06:58:37.152765 4845 generic.go:334] "Generic (PLEG): container finished" podID="5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f" containerID="19508e4e6b6f63cdf55048175354649ba7b7636716884ebe6ec56c1dbfbb6e98" exitCode=0
Oct 06 06:58:37 crc kubenswrapper[4845]: I1006 06:58:37.152836 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnl58" event={"ID":"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f","Type":"ContainerDied","Data":"19508e4e6b6f63cdf55048175354649ba7b7636716884ebe6ec56c1dbfbb6e98"}
Oct 06 06:58:38 crc kubenswrapper[4845]: I1006 06:58:38.161363 4845 generic.go:334] "Generic (PLEG): container finished" podID="e2e58f10-5cf5-4a79-8303-bf14cd55f546" containerID="ea91221c4f94174e372a4e0c18262b3f89531787b56d4dafb4af5b01905079fc" exitCode=0
Oct 06 06:58:38 crc kubenswrapper[4845]: I1006 06:58:38.161406 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7qkr" event={"ID":"e2e58f10-5cf5-4a79-8303-bf14cd55f546","Type":"ContainerDied","Data":"ea91221c4f94174e372a4e0c18262b3f89531787b56d4dafb4af5b01905079fc"}
Oct 06 06:58:38 crc kubenswrapper[4845]: I1006 06:58:38.163911 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-5qmws" event={"ID":"c21520ff-b41e-4433-8656-d248f0975c60","Type":"ContainerStarted","Data":"7c86604099b0f86dd86584f0f556857c2f8dd6606e54082138101f24812b00fb"}
Oct 06 06:58:38 crc kubenswrapper[4845]: I1006 06:58:38.164413 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-5qmws"
Oct 06 06:58:38 crc kubenswrapper[4845]: I1006 06:58:38.166986 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnl58" event={"ID":"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f","Type":"ContainerStarted","Data":"64c01dd0d92e08cd570df321f2033f1439c6bbf8ce0cb92dc20fd6c1f7b28ede"}
Oct 06 06:58:38 crc kubenswrapper[4845]: I1006 06:58:38.205226 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lnl58" podStartSLOduration=7.618348836 podStartE2EDuration="10.205212198s" podCreationTimestamp="2025-10-06 06:58:28 +0000 UTC" firstStartedPulling="2025-10-06 06:58:35.04161334 +0000 UTC m=+799.556354348" lastFinishedPulling="2025-10-06 06:58:37.628476702 +0000 UTC m=+802.143217710" observedRunningTime="2025-10-06 06:58:38.204852279 +0000 UTC m=+802.719593287" watchObservedRunningTime="2025-10-06 06:58:38.205212198 +0000 UTC m=+802.719953206"
Oct 06 06:58:38 crc kubenswrapper[4845]: I1006 06:58:38.223175 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-5qmws" podStartSLOduration=2.494196481 podStartE2EDuration="16.22315965s" podCreationTimestamp="2025-10-06 06:58:22 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.898779824 +0000 UTC m=+788.413520832" lastFinishedPulling="2025-10-06 06:58:37.627742993 +0000 UTC m=+802.142484001" observedRunningTime="2025-10-06 06:58:38.218314748 +0000 UTC m=+802.733055756" watchObservedRunningTime="2025-10-06 06:58:38.22315965 +0000 UTC m=+802.737900658"
Oct 06 06:58:38 crc kubenswrapper[4845]: I1006 06:58:38.678870 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lnl58"
Oct 06 06:58:38 crc kubenswrapper[4845]: I1006 06:58:38.678969 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lnl58"
Oct 06 06:58:39 crc kubenswrapper[4845]: I1006 06:58:39.175222 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7qkr" event={"ID":"e2e58f10-5cf5-4a79-8303-bf14cd55f546","Type":"ContainerStarted","Data":"757940aaff526fac51932a2f2f430af96f708bd19bfba59b09eeca37297c77b6"}
Oct 06 06:58:39 crc kubenswrapper[4845]: I1006 06:58:39.177163 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-c4htj" event={"ID":"49f6ba5b-4750-418f-ac64-574d92bf6f61","Type":"ContainerStarted","Data":"280e35487bb63e4786b7dfc6b39af9f72827834469376acb55b62b378f4fac0d"}
Oct 06 06:58:39 crc kubenswrapper[4845]: I1006 06:58:39.177462 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-c4htj"
Oct 06 06:58:39 crc kubenswrapper[4845]: I1006 06:58:39.193774 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n7qkr" podStartSLOduration=2.362652731 podStartE2EDuration="5.19376121s" podCreationTimestamp="2025-10-06 06:58:34 +0000 UTC" firstStartedPulling="2025-10-06 06:58:36.140577592 +0000 UTC m=+800.655318600" lastFinishedPulling="2025-10-06 06:58:38.971686071 +0000 UTC m=+803.486427079" observedRunningTime="2025-10-06 06:58:39.190663312 +0000 UTC m=+803.705404320" watchObservedRunningTime="2025-10-06 06:58:39.19376121 +0000 UTC m=+803.708502218"
Oct 06 06:58:39 crc kubenswrapper[4845]: I1006 06:58:39.208125 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-c4htj" podStartSLOduration=3.237707064 podStartE2EDuration="18.208106681s" podCreationTimestamp="2025-10-06 06:58:21 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.898547898 +0000 UTC m=+788.413288906" lastFinishedPulling="2025-10-06 06:58:38.868947515 +0000 UTC m=+803.383688523" observedRunningTime="2025-10-06 06:58:39.202792278 +0000 UTC m=+803.717533296" watchObservedRunningTime="2025-10-06 06:58:39.208106681 +0000 UTC m=+803.722847709"
Oct 06 06:58:39 crc kubenswrapper[4845]: I1006 06:58:39.723387 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-lnl58" podUID="5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f" containerName="registry-server" probeResult="failure" output=<
Oct 06 06:58:39 crc kubenswrapper[4845]: timeout: failed to connect service ":50051" within 1s
Oct 06 06:58:39 crc kubenswrapper[4845]: >
Oct 06 06:58:41 crc kubenswrapper[4845]: I1006 06:58:41.201608 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw" event={"ID":"880d791a-bcb7-4f71-8a16-015bd26af4d9","Type":"ContainerStarted","Data":"4abd5c7c71cf3fa3577dc2d5cf43edb1484c4036d1e987031302c3273c88e7cf"}
Oct 06 06:58:41 crc kubenswrapper[4845]: I1006 06:58:41.221768 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw" podStartSLOduration=2.710848834 podStartE2EDuration="19.221748725s" podCreationTimestamp="2025-10-06 06:58:22 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.898535318 +0000 UTC m=+788.413276326" lastFinishedPulling="2025-10-06 06:58:40.409435209 +0000 UTC m=+804.924176217" observedRunningTime="2025-10-06 06:58:41.214979804 +0000 UTC m=+805.729720832" watchObservedRunningTime="2025-10-06 06:58:41.221748725 +0000 UTC m=+805.736489733"
Oct 06 06:58:42 crc kubenswrapper[4845]: I1006 06:58:42.053471 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-mvrqc"
Oct 06 06:58:42 crc kubenswrapper[4845]: I1006 06:58:42.084194 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-hf7mm"
Oct 06 06:58:42 crc kubenswrapper[4845]: I1006 06:58:42.120564 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-qzbsb"
Oct 06 06:58:42 crc kubenswrapper[4845]: I1006 06:58:42.181253 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-698456cdc6-kdbk5"
Oct 06 06:58:42 crc kubenswrapper[4845]: I1006 06:58:42.190848 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-nw58b"
Oct 06 06:58:42 crc kubenswrapper[4845]: I1006 06:58:42.224263 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6675647785-4xx5n"
Oct 06 06:58:42 crc kubenswrapper[4845]: I1006 06:58:42.256280 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-s8lqw"
Oct 06 06:58:42 crc kubenswrapper[4845]: I1006 06:58:42.315664 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-lx8hg"
Oct 06 06:58:42 crc kubenswrapper[4845]: I1006 06:58:42.327066 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-tf759"
Oct 06 06:58:42 crc kubenswrapper[4845]: I1006 06:58:42.330197 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-6vhjt"
Oct 06 06:58:42 crc kubenswrapper[4845]: I1006 06:58:42.478480 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-wghng"
Oct 06 06:58:42 crc kubenswrapper[4845]: I1006 06:58:42.568279 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-t66dv"
Oct 06 06:58:42 crc kubenswrapper[4845]: I1006 06:58:42.604652 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-c968bb45-h4m5g"
Oct 06 06:58:42 crc kubenswrapper[4845]: I1006 06:58:42.625979 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-5qmws"
Oct 06 06:58:42 crc kubenswrapper[4845]: I1006 06:58:42.676348 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-76d5577b-9v85j"
Oct 06 06:58:43 crc kubenswrapper[4845]: I1006 06:58:43.218696 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-c2mlw" event={"ID":"e3721fe9-cb88-4326-aa90-2d08b909515e","Type":"ContainerStarted","Data":"3d489d65a6e9c26afafd5c72f774193a4d91c1edb4ba71ee5e00ceace5aba074"}
Oct 06 06:58:43 crc kubenswrapper[4845]: I1006 06:58:43.219105 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-c2mlw"
Oct 06 06:58:43 crc kubenswrapper[4845]: I1006 06:58:43.223776 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-mj4wz" event={"ID":"dcb43cda-fbc6-4092-bf5d-296858e233cd","Type":"ContainerStarted","Data":"5f6633337bb51dfb1e303523c5a601343f518f77441a65696ffeed700ac477c5"}
Oct 06 06:58:43 crc kubenswrapper[4845]: I1006 06:58:43.224776 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-mj4wz"
Oct 06 06:58:43 crc kubenswrapper[4845]: I1006 06:58:43.227042 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-wrcqz" event={"ID":"93be683f-25a1-477e-b676-5bc7be2c3bf8","Type":"ContainerStarted","Data":"72fd91f0178c107a90d83a30c76575776506381dab28e180765501962f8da874"}
Oct 06 06:58:43 crc kubenswrapper[4845]: I1006 06:58:43.227227 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-wrcqz"
Oct 06 06:58:43 crc kubenswrapper[4845]: I1006 06:58:43.235844 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-c2mlw" podStartSLOduration=3.064926945 podStartE2EDuration="21.235825209s" podCreationTimestamp="2025-10-06 06:58:22 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.888975067 +0000 UTC m=+788.403716075" lastFinishedPulling="2025-10-06 06:58:42.059873331 +0000 UTC m=+806.574614339" observedRunningTime="2025-10-06 06:58:43.232391653 +0000 UTC m=+807.747132671" watchObservedRunningTime="2025-10-06 06:58:43.235825209 +0000 UTC m=+807.750566237"
Oct 06 06:58:43 crc kubenswrapper[4845]: I1006 06:58:43.253163 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-wrcqz" podStartSLOduration=4.094868459 podStartE2EDuration="22.253143345s" podCreationTimestamp="2025-10-06 06:58:21 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.8986268 +0000 UTC m=+788.413367808" lastFinishedPulling="2025-10-06 06:58:42.056901686 +0000 UTC m=+806.571642694" observedRunningTime="2025-10-06 06:58:43.251458423 +0000 UTC m=+807.766199451" watchObservedRunningTime="2025-10-06 06:58:43.253143345 +0000 UTC m=+807.767884363"
Oct 06 06:58:43 crc kubenswrapper[4845]: I1006 06:58:43.269094 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-mj4wz" podStartSLOduration=3.046280157 podStartE2EDuration="21.269074376s" podCreationTimestamp="2025-10-06 06:58:22 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.899091412 +0000 UTC m=+788.413832420" lastFinishedPulling="2025-10-06 06:58:42.121885631 +0000 UTC m=+806.636626639" observedRunningTime="2025-10-06 06:58:43.266178893 +0000 UTC m=+807.780919911" watchObservedRunningTime="2025-10-06 06:58:43.269074376 +0000 UTC m=+807.783815384"
Oct 06 06:58:44 crc kubenswrapper[4845]: I1006 06:58:44.085955 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d582qhb"
Oct 06 06:58:44 crc kubenswrapper[4845]: I1006 06:58:44.318103 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lfwvb"
Oct 06 06:58:44 crc kubenswrapper[4845]: I1006 06:58:44.321042 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lfwvb"
Oct 06 06:58:44 crc kubenswrapper[4845]: I1006 06:58:44.362331 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lfwvb"
Oct 06 06:58:45 crc kubenswrapper[4845]: I1006 06:58:45.300544 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lfwvb"
Oct 06 06:58:45 crc kubenswrapper[4845]: I1006 06:58:45.313074 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n7qkr"
Oct 06 06:58:45 crc kubenswrapper[4845]: I1006 06:58:45.313359 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n7qkr"
Oct 06 06:58:45 crc kubenswrapper[4845]: I1006 06:58:45.348581 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lfwvb"]
Oct 06 06:58:45 crc kubenswrapper[4845]: I1006 06:58:45.379698 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n7qkr"
Oct 06 06:58:46 crc kubenswrapper[4845]: I1006 06:58:46.292298 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n7qkr"
Oct 06 06:58:47 crc kubenswrapper[4845]: I1006 06:58:47.118715 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n7qkr"]
Oct 06 06:58:47 crc kubenswrapper[4845]: I1006 06:58:47.256429 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lfwvb" podUID="440a58e5-104c-4791-9cba-e62a0afbe4a2" containerName="registry-server" containerID="cri-o://511d2440745a81566bd04eee3c63dc3ab3ceac9f0675178ce2423681f7ff5e08" gracePeriod=2
Oct 06 06:58:48 crc kubenswrapper[4845]: I1006 06:58:48.268055 4845 generic.go:334] "Generic (PLEG): container finished" podID="440a58e5-104c-4791-9cba-e62a0afbe4a2" containerID="511d2440745a81566bd04eee3c63dc3ab3ceac9f0675178ce2423681f7ff5e08" exitCode=0
Oct 06 06:58:48 crc kubenswrapper[4845]: I1006 06:58:48.268126 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lfwvb" event={"ID":"440a58e5-104c-4791-9cba-e62a0afbe4a2","Type":"ContainerDied","Data":"511d2440745a81566bd04eee3c63dc3ab3ceac9f0675178ce2423681f7ff5e08"}
Oct 06 06:58:48 crc kubenswrapper[4845]: I1006 06:58:48.268475 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n7qkr" podUID="e2e58f10-5cf5-4a79-8303-bf14cd55f546" containerName="registry-server" containerID="cri-o://757940aaff526fac51932a2f2f430af96f708bd19bfba59b09eeca37297c77b6" gracePeriod=2
Oct 06 06:58:48 crc kubenswrapper[4845]: I1006 06:58:48.747530 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lnl58"
Oct 06 06:58:48 crc kubenswrapper[4845]: I1006 06:58:48.811846 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lnl58"
Oct 06 06:58:50 crc kubenswrapper[4845]: I1006 06:58:50.286775 4845 generic.go:334] "Generic (PLEG): container finished" podID="e2e58f10-5cf5-4a79-8303-bf14cd55f546" containerID="757940aaff526fac51932a2f2f430af96f708bd19bfba59b09eeca37297c77b6" exitCode=0
Oct 06 06:58:50 crc kubenswrapper[4845]: I1006 06:58:50.286908 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7qkr" event={"ID":"e2e58f10-5cf5-4a79-8303-bf14cd55f546","Type":"ContainerDied","Data":"757940aaff526fac51932a2f2f430af96f708bd19bfba59b09eeca37297c77b6"}
Oct 06 06:58:50 crc kubenswrapper[4845]: I1006 06:58:50.732467 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lnl58"]
Oct 06 06:58:50 crc kubenswrapper[4845]: I1006 06:58:50.732943 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lnl58" podUID="5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f" containerName="registry-server" containerID="cri-o://64c01dd0d92e08cd570df321f2033f1439c6bbf8ce0cb92dc20fd6c1f7b28ede" gracePeriod=2
Oct 06 06:58:51 crc kubenswrapper[4845]: I1006 06:58:51.299958 4845 generic.go:334] "Generic (PLEG): container finished" podID="5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f" containerID="64c01dd0d92e08cd570df321f2033f1439c6bbf8ce0cb92dc20fd6c1f7b28ede" exitCode=0
Oct 06 06:58:51 crc kubenswrapper[4845]: I1006 06:58:51.300161 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnl58" event={"ID":"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f","Type":"ContainerDied","Data":"64c01dd0d92e08cd570df321f2033f1439c6bbf8ce0cb92dc20fd6c1f7b28ede"}
Oct 06 06:58:51 crc kubenswrapper[4845]: I1006 06:58:51.819504 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lnl58"
Oct 06 06:58:51 crc kubenswrapper[4845]: I1006 06:58:51.861989 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n7qkr"
Oct 06 06:58:51 crc kubenswrapper[4845]: I1006 06:58:51.874558 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lfwvb"
Oct 06 06:58:51 crc kubenswrapper[4845]: I1006 06:58:51.939208 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f-catalog-content\") pod \"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f\" (UID: \"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f\") "
Oct 06 06:58:51 crc kubenswrapper[4845]: I1006 06:58:51.939287 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f-utilities\") pod \"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f\" (UID: \"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f\") "
Oct 06 06:58:51 crc kubenswrapper[4845]: I1006 06:58:51.939463 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm5fk\" (UniqueName: \"kubernetes.io/projected/5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f-kube-api-access-rm5fk\") pod \"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f\" (UID: \"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f\") "
Oct 06 06:58:51 crc kubenswrapper[4845]: I1006 06:58:51.942040 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f-utilities" (OuterVolumeSpecName: "utilities") pod "5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f" (UID: "5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:58:51 crc kubenswrapper[4845]: I1006 06:58:51.948558 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f-kube-api-access-rm5fk" (OuterVolumeSpecName: "kube-api-access-rm5fk") pod "5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f" (UID: "5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f"). InnerVolumeSpecName "kube-api-access-rm5fk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.002802 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f" (UID: "5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.040600 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e58f10-5cf5-4a79-8303-bf14cd55f546-catalog-content\") pod \"e2e58f10-5cf5-4a79-8303-bf14cd55f546\" (UID: \"e2e58f10-5cf5-4a79-8303-bf14cd55f546\") "
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.040736 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb4g9\" (UniqueName: \"kubernetes.io/projected/e2e58f10-5cf5-4a79-8303-bf14cd55f546-kube-api-access-wb4g9\") pod \"e2e58f10-5cf5-4a79-8303-bf14cd55f546\" (UID: \"e2e58f10-5cf5-4a79-8303-bf14cd55f546\") "
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.040785 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e58f10-5cf5-4a79-8303-bf14cd55f546-utilities\") pod \"e2e58f10-5cf5-4a79-8303-bf14cd55f546\" (UID: \"e2e58f10-5cf5-4a79-8303-bf14cd55f546\") "
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.040867 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440a58e5-104c-4791-9cba-e62a0afbe4a2-utilities\") pod \"440a58e5-104c-4791-9cba-e62a0afbe4a2\" (UID: \"440a58e5-104c-4791-9cba-e62a0afbe4a2\") "
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.041711 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/440a58e5-104c-4791-9cba-e62a0afbe4a2-utilities" (OuterVolumeSpecName: "utilities") pod "440a58e5-104c-4791-9cba-e62a0afbe4a2" (UID: "440a58e5-104c-4791-9cba-e62a0afbe4a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.042069 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2e58f10-5cf5-4a79-8303-bf14cd55f546-utilities" (OuterVolumeSpecName: "utilities") pod "e2e58f10-5cf5-4a79-8303-bf14cd55f546" (UID: "e2e58f10-5cf5-4a79-8303-bf14cd55f546"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.042365 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9p62\" (UniqueName: \"kubernetes.io/projected/440a58e5-104c-4791-9cba-e62a0afbe4a2-kube-api-access-n9p62\") pod \"440a58e5-104c-4791-9cba-e62a0afbe4a2\" (UID: \"440a58e5-104c-4791-9cba-e62a0afbe4a2\") "
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.042894 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440a58e5-104c-4791-9cba-e62a0afbe4a2-catalog-content\") pod \"440a58e5-104c-4791-9cba-e62a0afbe4a2\" (UID: \"440a58e5-104c-4791-9cba-e62a0afbe4a2\") "
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.043522 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.043549 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440a58e5-104c-4791-9cba-e62a0afbe4a2-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.043564 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm5fk\" (UniqueName: \"kubernetes.io/projected/5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f-kube-api-access-rm5fk\") on node \"crc\" DevicePath \"\""
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.043581 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.043594 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e58f10-5cf5-4a79-8303-bf14cd55f546-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.045684 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2e58f10-5cf5-4a79-8303-bf14cd55f546-kube-api-access-wb4g9" (OuterVolumeSpecName: "kube-api-access-wb4g9") pod "e2e58f10-5cf5-4a79-8303-bf14cd55f546" (UID: "e2e58f10-5cf5-4a79-8303-bf14cd55f546"). InnerVolumeSpecName "kube-api-access-wb4g9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.046298 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/440a58e5-104c-4791-9cba-e62a0afbe4a2-kube-api-access-n9p62" (OuterVolumeSpecName: "kube-api-access-n9p62") pod "440a58e5-104c-4791-9cba-e62a0afbe4a2" (UID: "440a58e5-104c-4791-9cba-e62a0afbe4a2"). InnerVolumeSpecName "kube-api-access-n9p62". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.055183 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/440a58e5-104c-4791-9cba-e62a0afbe4a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "440a58e5-104c-4791-9cba-e62a0afbe4a2" (UID: "440a58e5-104c-4791-9cba-e62a0afbe4a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.083244 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2e58f10-5cf5-4a79-8303-bf14cd55f546-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2e58f10-5cf5-4a79-8303-bf14cd55f546" (UID: "e2e58f10-5cf5-4a79-8303-bf14cd55f546"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.144290 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e58f10-5cf5-4a79-8303-bf14cd55f546-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.144325 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb4g9\" (UniqueName: \"kubernetes.io/projected/e2e58f10-5cf5-4a79-8303-bf14cd55f546-kube-api-access-wb4g9\") on node \"crc\" DevicePath \"\""
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.144339 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9p62\" (UniqueName: \"kubernetes.io/projected/440a58e5-104c-4791-9cba-e62a0afbe4a2-kube-api-access-n9p62\") on node \"crc\" DevicePath \"\""
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.144348 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440a58e5-104c-4791-9cba-e62a0afbe4a2-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.310579 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-nsmkl" event={"ID":"c921a4a9-e09a-4fd2-965e-c13f7fee169e","Type":"ContainerStarted","Data":"c46e4a09f6528d753d108a8de1cf2666bb08e064bd05066cf829f29b3ea268d3"}
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.311878 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-nsmkl"
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.316258 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n7qkr"
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.316252 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n7qkr" event={"ID":"e2e58f10-5cf5-4a79-8303-bf14cd55f546","Type":"ContainerDied","Data":"23d47843e8607317c6c72db60670d40bc3d5b5069807aa6d1d53d37504719b2c"}
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.316456 4845 scope.go:117] "RemoveContainer" containerID="757940aaff526fac51932a2f2f430af96f708bd19bfba59b09eeca37297c77b6"
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.320449 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnl58" event={"ID":"5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f","Type":"ContainerDied","Data":"39d4426e45077972673373670255b6446f1252bf3dfc38b42812a2903e343a51"}
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.320467 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lnl58"
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.325219 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lfwvb" event={"ID":"440a58e5-104c-4791-9cba-e62a0afbe4a2","Type":"ContainerDied","Data":"5f3d65b1f61b92de5ee13daef0a3c6d14de822d0f35aa6f46b7d008448f64fbc"}
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.325362 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lfwvb"
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.356462 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-nsmkl" podStartSLOduration=2.812289438 podStartE2EDuration="30.356439468s" podCreationTimestamp="2025-10-06 06:58:22 +0000 UTC" firstStartedPulling="2025-10-06 06:58:23.897584184 +0000 UTC m=+788.412325192" lastFinishedPulling="2025-10-06 06:58:51.441734194 +0000 UTC m=+815.956475222" observedRunningTime="2025-10-06 06:58:52.33826987 +0000 UTC m=+816.853010878" watchObservedRunningTime="2025-10-06 06:58:52.356439468 +0000 UTC m=+816.871180486"
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.356672 4845 scope.go:117] "RemoveContainer" containerID="ea91221c4f94174e372a4e0c18262b3f89531787b56d4dafb4af5b01905079fc"
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.358924 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lfwvb"]
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.371591 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lfwvb"]
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.383428 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n7qkr"]
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.387523 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n7qkr"]
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.399590 4845 scope.go:117] "RemoveContainer" containerID="c31b9d406a577e180c40e450d67000ccf86f7dd9220e9847dcde9e6492cbec9c"
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.403930 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lnl58"]
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.410252 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lnl58"]
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.421948 4845 scope.go:117] "RemoveContainer" containerID="64c01dd0d92e08cd570df321f2033f1439c6bbf8ce0cb92dc20fd6c1f7b28ede"
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.452608 4845 scope.go:117] "RemoveContainer" containerID="19508e4e6b6f63cdf55048175354649ba7b7636716884ebe6ec56c1dbfbb6e98"
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.485081 4845 scope.go:117] "RemoveContainer" containerID="bf6334b78585957e6df826d9b7da57137e2bf56e53285dc55eb9436cabfe726d"
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.506396 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-c4htj"
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.544934 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-wrcqz"
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.580725 4845 scope.go:117] "RemoveContainer" containerID="511d2440745a81566bd04eee3c63dc3ab3ceac9f0675178ce2423681f7ff5e08"
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.612489 4845 scope.go:117] "RemoveContainer" containerID="9524fc40a392690a7848b807187e15e0d57777c0cf6f7ed68c8035bfa951b817"
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.629756 4845 scope.go:117] "RemoveContainer" containerID="48a55ba6250bd8daf8eb46bd619f6fadb520fe62add300a9dac3944826359b71"
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.721591 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-c2mlw"
Oct 06 06:58:52 crc kubenswrapper[4845]: I1006 06:58:52.936208 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-mj4wz"
Oct 06 06:58:54 crc kubenswrapper[4845]: I1006 06:58:54.238002 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="440a58e5-104c-4791-9cba-e62a0afbe4a2" path="/var/lib/kubelet/pods/440a58e5-104c-4791-9cba-e62a0afbe4a2/volumes"
Oct 06 06:58:54 crc kubenswrapper[4845]: I1006 06:58:54.239280 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f" path="/var/lib/kubelet/pods/5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f/volumes"
Oct 06 06:58:54 crc kubenswrapper[4845]: I1006 06:58:54.240188 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2e58f10-5cf5-4a79-8303-bf14cd55f546" path="/var/lib/kubelet/pods/e2e58f10-5cf5-4a79-8303-bf14cd55f546/volumes"
Oct 06 06:59:02 crc kubenswrapper[4845]: I1006 06:59:02.791496 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-nsmkl"
Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.973708 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6db48c6849-5wxlp"]
Oct 06 06:59:20 crc kubenswrapper[4845]: E1006 06:59:20.974656 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f" containerName="extract-content"
Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.974674 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f" containerName="extract-content"
Oct 06 06:59:20 crc kubenswrapper[4845]: E1006 06:59:20.974697 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e58f10-5cf5-4a79-8303-bf14cd55f546" containerName="extract-utilities"
Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.974707 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e58f10-5cf5-4a79-8303-bf14cd55f546" containerName="extract-utilities"
Oct 06 06:59:20 crc kubenswrapper[4845]: E1006 06:59:20.974748 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440a58e5-104c-4791-9cba-e62a0afbe4a2" containerName="registry-server"
Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.974756 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="440a58e5-104c-4791-9cba-e62a0afbe4a2" containerName="registry-server"
Oct 06 06:59:20 crc kubenswrapper[4845]: E1006 06:59:20.974773 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440a58e5-104c-4791-9cba-e62a0afbe4a2" containerName="extract-content"
Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.974781 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="440a58e5-104c-4791-9cba-e62a0afbe4a2" containerName="extract-content"
Oct 06 06:59:20 crc kubenswrapper[4845]: E1006 06:59:20.974799 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f" containerName="registry-server"
Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.974807 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f" containerName="registry-server"
Oct 06 06:59:20 crc kubenswrapper[4845]: E1006 06:59:20.974823 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e58f10-5cf5-4a79-8303-bf14cd55f546" containerName="registry-server"
Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.974832 4845
state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e58f10-5cf5-4a79-8303-bf14cd55f546" containerName="registry-server" Oct 06 06:59:20 crc kubenswrapper[4845]: E1006 06:59:20.974850 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440a58e5-104c-4791-9cba-e62a0afbe4a2" containerName="extract-utilities" Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.974860 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="440a58e5-104c-4791-9cba-e62a0afbe4a2" containerName="extract-utilities" Oct 06 06:59:20 crc kubenswrapper[4845]: E1006 06:59:20.974873 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e58f10-5cf5-4a79-8303-bf14cd55f546" containerName="extract-content" Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.974880 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e58f10-5cf5-4a79-8303-bf14cd55f546" containerName="extract-content" Oct 06 06:59:20 crc kubenswrapper[4845]: E1006 06:59:20.974892 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f" containerName="extract-utilities" Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.974899 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f" containerName="extract-utilities" Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.975057 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2e58f10-5cf5-4a79-8303-bf14cd55f546" containerName="registry-server" Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.975092 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d52dbbc-6e7a-4127-b63c-bee8e4dbf79f" containerName="registry-server" Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.975102 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="440a58e5-104c-4791-9cba-e62a0afbe4a2" containerName="registry-server" Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.976015 4845 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6db48c6849-5wxlp" Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.978363 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.978854 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-x8mfw" Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.979317 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6c48b55-c500-4427-a66e-ddaa189a08af-config\") pod \"dnsmasq-dns-6db48c6849-5wxlp\" (UID: \"e6c48b55-c500-4427-a66e-ddaa189a08af\") " pod="openstack/dnsmasq-dns-6db48c6849-5wxlp" Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.979427 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6c9l\" (UniqueName: \"kubernetes.io/projected/e6c48b55-c500-4427-a66e-ddaa189a08af-kube-api-access-f6c9l\") pod \"dnsmasq-dns-6db48c6849-5wxlp\" (UID: \"e6c48b55-c500-4427-a66e-ddaa189a08af\") " pod="openstack/dnsmasq-dns-6db48c6849-5wxlp" Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.980163 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.987052 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6db48c6849-5wxlp"] Oct 06 06:59:20 crc kubenswrapper[4845]: I1006 06:59:20.988036 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.064919 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9d7c66485-5k26z"] Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.068389 4845 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d7c66485-5k26z" Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.073862 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.076403 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d7c66485-5k26z"] Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.081081 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8pz5\" (UniqueName: \"kubernetes.io/projected/6d353f56-d5d7-41c5-941a-fa4dd569b3d8-kube-api-access-d8pz5\") pod \"dnsmasq-dns-9d7c66485-5k26z\" (UID: \"6d353f56-d5d7-41c5-941a-fa4dd569b3d8\") " pod="openstack/dnsmasq-dns-9d7c66485-5k26z" Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.081155 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d353f56-d5d7-41c5-941a-fa4dd569b3d8-config\") pod \"dnsmasq-dns-9d7c66485-5k26z\" (UID: \"6d353f56-d5d7-41c5-941a-fa4dd569b3d8\") " pod="openstack/dnsmasq-dns-9d7c66485-5k26z" Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.081201 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6c9l\" (UniqueName: \"kubernetes.io/projected/e6c48b55-c500-4427-a66e-ddaa189a08af-kube-api-access-f6c9l\") pod \"dnsmasq-dns-6db48c6849-5wxlp\" (UID: \"e6c48b55-c500-4427-a66e-ddaa189a08af\") " pod="openstack/dnsmasq-dns-6db48c6849-5wxlp" Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.081346 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d353f56-d5d7-41c5-941a-fa4dd569b3d8-dns-svc\") pod \"dnsmasq-dns-9d7c66485-5k26z\" (UID: \"6d353f56-d5d7-41c5-941a-fa4dd569b3d8\") " 
pod="openstack/dnsmasq-dns-9d7c66485-5k26z" Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.081461 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6c48b55-c500-4427-a66e-ddaa189a08af-config\") pod \"dnsmasq-dns-6db48c6849-5wxlp\" (UID: \"e6c48b55-c500-4427-a66e-ddaa189a08af\") " pod="openstack/dnsmasq-dns-6db48c6849-5wxlp" Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.082244 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6c48b55-c500-4427-a66e-ddaa189a08af-config\") pod \"dnsmasq-dns-6db48c6849-5wxlp\" (UID: \"e6c48b55-c500-4427-a66e-ddaa189a08af\") " pod="openstack/dnsmasq-dns-6db48c6849-5wxlp" Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.107654 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6c9l\" (UniqueName: \"kubernetes.io/projected/e6c48b55-c500-4427-a66e-ddaa189a08af-kube-api-access-f6c9l\") pod \"dnsmasq-dns-6db48c6849-5wxlp\" (UID: \"e6c48b55-c500-4427-a66e-ddaa189a08af\") " pod="openstack/dnsmasq-dns-6db48c6849-5wxlp" Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.183159 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d353f56-d5d7-41c5-941a-fa4dd569b3d8-dns-svc\") pod \"dnsmasq-dns-9d7c66485-5k26z\" (UID: \"6d353f56-d5d7-41c5-941a-fa4dd569b3d8\") " pod="openstack/dnsmasq-dns-9d7c66485-5k26z" Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.183365 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8pz5\" (UniqueName: \"kubernetes.io/projected/6d353f56-d5d7-41c5-941a-fa4dd569b3d8-kube-api-access-d8pz5\") pod \"dnsmasq-dns-9d7c66485-5k26z\" (UID: \"6d353f56-d5d7-41c5-941a-fa4dd569b3d8\") " pod="openstack/dnsmasq-dns-9d7c66485-5k26z" Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 
06:59:21.183438 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d353f56-d5d7-41c5-941a-fa4dd569b3d8-config\") pod \"dnsmasq-dns-9d7c66485-5k26z\" (UID: \"6d353f56-d5d7-41c5-941a-fa4dd569b3d8\") " pod="openstack/dnsmasq-dns-9d7c66485-5k26z" Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.184119 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d353f56-d5d7-41c5-941a-fa4dd569b3d8-dns-svc\") pod \"dnsmasq-dns-9d7c66485-5k26z\" (UID: \"6d353f56-d5d7-41c5-941a-fa4dd569b3d8\") " pod="openstack/dnsmasq-dns-9d7c66485-5k26z" Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.184182 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d353f56-d5d7-41c5-941a-fa4dd569b3d8-config\") pod \"dnsmasq-dns-9d7c66485-5k26z\" (UID: \"6d353f56-d5d7-41c5-941a-fa4dd569b3d8\") " pod="openstack/dnsmasq-dns-9d7c66485-5k26z" Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.201290 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8pz5\" (UniqueName: \"kubernetes.io/projected/6d353f56-d5d7-41c5-941a-fa4dd569b3d8-kube-api-access-d8pz5\") pod \"dnsmasq-dns-9d7c66485-5k26z\" (UID: \"6d353f56-d5d7-41c5-941a-fa4dd569b3d8\") " pod="openstack/dnsmasq-dns-9d7c66485-5k26z" Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.302128 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6db48c6849-5wxlp" Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.395885 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9d7c66485-5k26z" Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.784837 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6db48c6849-5wxlp"] Oct 06 06:59:21 crc kubenswrapper[4845]: I1006 06:59:21.868814 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d7c66485-5k26z"] Oct 06 06:59:21 crc kubenswrapper[4845]: W1006 06:59:21.871850 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d353f56_d5d7_41c5_941a_fa4dd569b3d8.slice/crio-8bcf38cde8ae3069f932f1b04c5d2c24efc1b4efd9e314cfdf0a24350aec905f WatchSource:0}: Error finding container 8bcf38cde8ae3069f932f1b04c5d2c24efc1b4efd9e314cfdf0a24350aec905f: Status 404 returned error can't find the container with id 8bcf38cde8ae3069f932f1b04c5d2c24efc1b4efd9e314cfdf0a24350aec905f Oct 06 06:59:22 crc kubenswrapper[4845]: I1006 06:59:22.549464 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d7c66485-5k26z" event={"ID":"6d353f56-d5d7-41c5-941a-fa4dd569b3d8","Type":"ContainerStarted","Data":"8bcf38cde8ae3069f932f1b04c5d2c24efc1b4efd9e314cfdf0a24350aec905f"} Oct 06 06:59:22 crc kubenswrapper[4845]: I1006 06:59:22.550684 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db48c6849-5wxlp" event={"ID":"e6c48b55-c500-4427-a66e-ddaa189a08af","Type":"ContainerStarted","Data":"5e1dbc4241db30e914d85b5a6f1d4b69ab850a9c9bad00ffee2bee99c743ab29"} Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.242753 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6db48c6849-5wxlp"] Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.269283 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-689c78bb4c-mlcf4"] Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.270584 4845 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.275575 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689c78bb4c-mlcf4"] Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.433999 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvqrr\" (UniqueName: \"kubernetes.io/projected/199df9d9-f617-44a5-8afe-9e5d086c249b-kube-api-access-rvqrr\") pod \"dnsmasq-dns-689c78bb4c-mlcf4\" (UID: \"199df9d9-f617-44a5-8afe-9e5d086c249b\") " pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.434048 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/199df9d9-f617-44a5-8afe-9e5d086c249b-config\") pod \"dnsmasq-dns-689c78bb4c-mlcf4\" (UID: \"199df9d9-f617-44a5-8afe-9e5d086c249b\") " pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.434111 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/199df9d9-f617-44a5-8afe-9e5d086c249b-dns-svc\") pod \"dnsmasq-dns-689c78bb4c-mlcf4\" (UID: \"199df9d9-f617-44a5-8afe-9e5d086c249b\") " pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.534984 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvqrr\" (UniqueName: \"kubernetes.io/projected/199df9d9-f617-44a5-8afe-9e5d086c249b-kube-api-access-rvqrr\") pod \"dnsmasq-dns-689c78bb4c-mlcf4\" (UID: \"199df9d9-f617-44a5-8afe-9e5d086c249b\") " pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.535030 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/199df9d9-f617-44a5-8afe-9e5d086c249b-config\") pod \"dnsmasq-dns-689c78bb4c-mlcf4\" (UID: \"199df9d9-f617-44a5-8afe-9e5d086c249b\") " pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.535105 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/199df9d9-f617-44a5-8afe-9e5d086c249b-dns-svc\") pod \"dnsmasq-dns-689c78bb4c-mlcf4\" (UID: \"199df9d9-f617-44a5-8afe-9e5d086c249b\") " pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.536023 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/199df9d9-f617-44a5-8afe-9e5d086c249b-dns-svc\") pod \"dnsmasq-dns-689c78bb4c-mlcf4\" (UID: \"199df9d9-f617-44a5-8afe-9e5d086c249b\") " pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.536090 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/199df9d9-f617-44a5-8afe-9e5d086c249b-config\") pod \"dnsmasq-dns-689c78bb4c-mlcf4\" (UID: \"199df9d9-f617-44a5-8afe-9e5d086c249b\") " pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.557766 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvqrr\" (UniqueName: \"kubernetes.io/projected/199df9d9-f617-44a5-8afe-9e5d086c249b-kube-api-access-rvqrr\") pod \"dnsmasq-dns-689c78bb4c-mlcf4\" (UID: \"199df9d9-f617-44a5-8afe-9e5d086c249b\") " pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.559654 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d7c66485-5k26z"] Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.591877 4845 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.592657 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7495cbc78c-v5dbv"] Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.594586 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.606871 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7495cbc78c-v5dbv"] Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.639388 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a12043d8-5d3d-4eb7-918e-c8b620d880ca-config\") pod \"dnsmasq-dns-7495cbc78c-v5dbv\" (UID: \"a12043d8-5d3d-4eb7-918e-c8b620d880ca\") " pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.639481 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a12043d8-5d3d-4eb7-918e-c8b620d880ca-dns-svc\") pod \"dnsmasq-dns-7495cbc78c-v5dbv\" (UID: \"a12043d8-5d3d-4eb7-918e-c8b620d880ca\") " pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.639504 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfsn4\" (UniqueName: \"kubernetes.io/projected/a12043d8-5d3d-4eb7-918e-c8b620d880ca-kube-api-access-rfsn4\") pod \"dnsmasq-dns-7495cbc78c-v5dbv\" (UID: \"a12043d8-5d3d-4eb7-918e-c8b620d880ca\") " pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.743057 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a12043d8-5d3d-4eb7-918e-c8b620d880ca-config\") pod \"dnsmasq-dns-7495cbc78c-v5dbv\" (UID: \"a12043d8-5d3d-4eb7-918e-c8b620d880ca\") " pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.743166 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a12043d8-5d3d-4eb7-918e-c8b620d880ca-dns-svc\") pod \"dnsmasq-dns-7495cbc78c-v5dbv\" (UID: \"a12043d8-5d3d-4eb7-918e-c8b620d880ca\") " pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.743193 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfsn4\" (UniqueName: \"kubernetes.io/projected/a12043d8-5d3d-4eb7-918e-c8b620d880ca-kube-api-access-rfsn4\") pod \"dnsmasq-dns-7495cbc78c-v5dbv\" (UID: \"a12043d8-5d3d-4eb7-918e-c8b620d880ca\") " pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.744549 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a12043d8-5d3d-4eb7-918e-c8b620d880ca-config\") pod \"dnsmasq-dns-7495cbc78c-v5dbv\" (UID: \"a12043d8-5d3d-4eb7-918e-c8b620d880ca\") " pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.745702 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a12043d8-5d3d-4eb7-918e-c8b620d880ca-dns-svc\") pod \"dnsmasq-dns-7495cbc78c-v5dbv\" (UID: \"a12043d8-5d3d-4eb7-918e-c8b620d880ca\") " pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.806415 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfsn4\" (UniqueName: \"kubernetes.io/projected/a12043d8-5d3d-4eb7-918e-c8b620d880ca-kube-api-access-rfsn4\") pod 
\"dnsmasq-dns-7495cbc78c-v5dbv\" (UID: \"a12043d8-5d3d-4eb7-918e-c8b620d880ca\") " pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" Oct 06 06:59:24 crc kubenswrapper[4845]: I1006 06:59:24.925009 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.218233 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689c78bb4c-mlcf4"] Oct 06 06:59:25 crc kubenswrapper[4845]: W1006 06:59:25.229654 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod199df9d9_f617_44a5_8afe_9e5d086c249b.slice/crio-a1cd41b7ab2bcde44012e083380cebac8833554c79e7a853d96fcd9a5b989777 WatchSource:0}: Error finding container a1cd41b7ab2bcde44012e083380cebac8833554c79e7a853d96fcd9a5b989777: Status 404 returned error can't find the container with id a1cd41b7ab2bcde44012e083380cebac8833554c79e7a853d96fcd9a5b989777 Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.233646 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.399455 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.401948 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.406509 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.407861 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.408097 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.408175 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.408233 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.408361 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.414175 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.414812 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-g5scq" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.475331 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7495cbc78c-v5dbv"] Oct 06 06:59:25 crc kubenswrapper[4845]: W1006 06:59:25.482649 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda12043d8_5d3d_4eb7_918e_c8b620d880ca.slice/crio-edd4e0e13e0a3e51cb285f33d1eff520c280a600656b035e41a8f24f667cbc71 WatchSource:0}: Error finding container 
edd4e0e13e0a3e51cb285f33d1eff520c280a600656b035e41a8f24f667cbc71: Status 404 returned error can't find the container with id edd4e0e13e0a3e51cb285f33d1eff520c280a600656b035e41a8f24f667cbc71 Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.560829 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38d9a5cf-6de3-487c-a71c-374ca55ca525-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.561134 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38d9a5cf-6de3-487c-a71c-374ca55ca525-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.561189 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.561244 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.561274 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.561327 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38d9a5cf-6de3-487c-a71c-374ca55ca525-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.561348 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38d9a5cf-6de3-487c-a71c-374ca55ca525-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.564031 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.564104 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.564211 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/38d9a5cf-6de3-487c-a71c-374ca55ca525-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.564256 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbp69\" (UniqueName: \"kubernetes.io/projected/38d9a5cf-6de3-487c-a71c-374ca55ca525-kube-api-access-mbp69\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.614773 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" event={"ID":"199df9d9-f617-44a5-8afe-9e5d086c249b","Type":"ContainerStarted","Data":"a1cd41b7ab2bcde44012e083380cebac8833554c79e7a853d96fcd9a5b989777"} Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.616998 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" event={"ID":"a12043d8-5d3d-4eb7-918e-c8b620d880ca","Type":"ContainerStarted","Data":"edd4e0e13e0a3e51cb285f33d1eff520c280a600656b035e41a8f24f667cbc71"} Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.665273 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38d9a5cf-6de3-487c-a71c-374ca55ca525-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.665326 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38d9a5cf-6de3-487c-a71c-374ca55ca525-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 
06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.665362 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.665412 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.665437 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.665458 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38d9a5cf-6de3-487c-a71c-374ca55ca525-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.665477 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38d9a5cf-6de3-487c-a71c-374ca55ca525-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.665501 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.665548 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.665567 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38d9a5cf-6de3-487c-a71c-374ca55ca525-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.665600 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbp69\" (UniqueName: \"kubernetes.io/projected/38d9a5cf-6de3-487c-a71c-374ca55ca525-kube-api-access-mbp69\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.666446 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.666637 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.666747 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.667722 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38d9a5cf-6de3-487c-a71c-374ca55ca525-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.667889 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38d9a5cf-6de3-487c-a71c-374ca55ca525-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.668290 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38d9a5cf-6de3-487c-a71c-374ca55ca525-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.677126 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38d9a5cf-6de3-487c-a71c-374ca55ca525-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.677240 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.678876 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38d9a5cf-6de3-487c-a71c-374ca55ca525-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.691118 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.691315 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbp69\" (UniqueName: \"kubernetes.io/projected/38d9a5cf-6de3-487c-a71c-374ca55ca525-kube-api-access-mbp69\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.700857 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.717746 4845 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.719530 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.723088 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.723204 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.723098 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.726678 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.729398 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.730039 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-x6g58" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.730078 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.730237 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.730270 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.868295 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-config-data\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.868402 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.868427 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drdm6\" (UniqueName: \"kubernetes.io/projected/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-kube-api-access-drdm6\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.868447 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.868479 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.868524 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.868558 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.868596 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.868620 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.868636 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.868653 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.969718 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.970232 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.970280 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.970400 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.970418 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.970525 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-config-data\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.971362 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.972424 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 
06:59:25.972456 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drdm6\" (UniqueName: \"kubernetes.io/projected/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-kube-api-access-drdm6\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.972482 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.972540 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.972584 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.972635 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.972644 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.972761 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-config-data\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.973823 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.974039 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.979035 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.983502 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 
crc kubenswrapper[4845]: I1006 06:59:25.982529 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.983989 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:25 crc kubenswrapper[4845]: I1006 06:59:25.997204 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drdm6\" (UniqueName: \"kubernetes.io/projected/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-kube-api-access-drdm6\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:26 crc kubenswrapper[4845]: I1006 06:59:26.004185 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " pod="openstack/rabbitmq-server-0" Oct 06 06:59:26 crc kubenswrapper[4845]: I1006 06:59:26.088278 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 06:59:26 crc kubenswrapper[4845]: I1006 06:59:26.217063 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 06:59:27 crc kubenswrapper[4845]: I1006 06:59:27.130118 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 06:59:27 crc kubenswrapper[4845]: I1006 06:59:27.634030 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6a669571-1ec3-4cb8-8a07-e20c31ca87e5","Type":"ContainerStarted","Data":"2d3d3c7fca4a33cfadbaec6a6408b923aa90ada90d3090433c20bdae3e5822d9"} Oct 06 06:59:27 crc kubenswrapper[4845]: I1006 06:59:27.636843 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38d9a5cf-6de3-487c-a71c-374ca55ca525","Type":"ContainerStarted","Data":"a93c317988726ae1e9e51b745f4d50bfe2e532e35a9443aada242cd1e959b99c"} Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.292853 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.296614 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.301740 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.303644 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.304076 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.304297 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.305938 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-7x2pz" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.306269 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.314045 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.418111 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.419286 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.421775 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.421941 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-8xjk6" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.421971 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.422173 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.433398 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8d5452f2-c63d-4287-93c4-17b89651a7c1-secrets\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.433439 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8d5452f2-c63d-4287-93c4-17b89651a7c1-config-data-default\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.433458 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d5452f2-c63d-4287-93c4-17b89651a7c1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.433478 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8d5452f2-c63d-4287-93c4-17b89651a7c1-kolla-config\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.433497 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5452f2-c63d-4287-93c4-17b89651a7c1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.433513 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8d5452f2-c63d-4287-93c4-17b89651a7c1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.433554 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.433591 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d5452f2-c63d-4287-93c4-17b89651a7c1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.433621 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n878x\" (UniqueName: \"kubernetes.io/projected/8d5452f2-c63d-4287-93c4-17b89651a7c1-kube-api-access-n878x\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.434313 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.535734 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/441b3c5d-0205-472b-8356-e10a4b5b3a4a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.535807 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d5452f2-c63d-4287-93c4-17b89651a7c1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.535866 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n878x\" (UniqueName: \"kubernetes.io/projected/8d5452f2-c63d-4287-93c4-17b89651a7c1-kube-api-access-n878x\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.535889 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89mh8\" (UniqueName: \"kubernetes.io/projected/441b3c5d-0205-472b-8356-e10a4b5b3a4a-kube-api-access-89mh8\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " 
pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.535928 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8d5452f2-c63d-4287-93c4-17b89651a7c1-secrets\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.535946 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.535977 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8d5452f2-c63d-4287-93c4-17b89651a7c1-config-data-default\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.535998 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d5452f2-c63d-4287-93c4-17b89651a7c1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.536029 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8d5452f2-c63d-4287-93c4-17b89651a7c1-kolla-config\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.536060 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8d5452f2-c63d-4287-93c4-17b89651a7c1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.536077 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5452f2-c63d-4287-93c4-17b89651a7c1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.536096 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/441b3c5d-0205-472b-8356-e10a4b5b3a4a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.536127 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/441b3c5d-0205-472b-8356-e10a4b5b3a4a-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.536154 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/441b3c5d-0205-472b-8356-e10a4b5b3a4a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.536176 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/441b3c5d-0205-472b-8356-e10a4b5b3a4a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.536205 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.536227 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/441b3c5d-0205-472b-8356-e10a4b5b3a4a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.536248 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/441b3c5d-0205-472b-8356-e10a4b5b3a4a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.537124 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.537167 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/8d5452f2-c63d-4287-93c4-17b89651a7c1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.538150 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8d5452f2-c63d-4287-93c4-17b89651a7c1-kolla-config\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.538582 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8d5452f2-c63d-4287-93c4-17b89651a7c1-config-data-default\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.540444 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d5452f2-c63d-4287-93c4-17b89651a7c1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.544217 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d5452f2-c63d-4287-93c4-17b89651a7c1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.550210 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8d5452f2-c63d-4287-93c4-17b89651a7c1-secrets\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" 
Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.556498 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5452f2-c63d-4287-93c4-17b89651a7c1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.582813 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.596259 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n878x\" (UniqueName: \"kubernetes.io/projected/8d5452f2-c63d-4287-93c4-17b89651a7c1-kube-api-access-n878x\") pod \"openstack-galera-0\" (UID: \"8d5452f2-c63d-4287-93c4-17b89651a7c1\") " pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.629435 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.641043 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.641144 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/441b3c5d-0205-472b-8356-e10a4b5b3a4a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.641189 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/441b3c5d-0205-472b-8356-e10a4b5b3a4a-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.641223 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/441b3c5d-0205-472b-8356-e10a4b5b3a4a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.641249 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/441b3c5d-0205-472b-8356-e10a4b5b3a4a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.641293 
4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/441b3c5d-0205-472b-8356-e10a4b5b3a4a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.641329 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/441b3c5d-0205-472b-8356-e10a4b5b3a4a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.641396 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/441b3c5d-0205-472b-8356-e10a4b5b3a4a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.641467 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89mh8\" (UniqueName: \"kubernetes.io/projected/441b3c5d-0205-472b-8356-e10a4b5b3a4a-kube-api-access-89mh8\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.642132 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/441b3c5d-0205-472b-8356-e10a4b5b3a4a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.642751 4845 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/441b3c5d-0205-472b-8356-e10a4b5b3a4a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.642822 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/441b3c5d-0205-472b-8356-e10a4b5b3a4a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.642850 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.644935 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/441b3c5d-0205-472b-8356-e10a4b5b3a4a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.646924 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/441b3c5d-0205-472b-8356-e10a4b5b3a4a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.664210 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89mh8\" (UniqueName: 
\"kubernetes.io/projected/441b3c5d-0205-472b-8356-e10a4b5b3a4a-kube-api-access-89mh8\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.664892 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/441b3c5d-0205-472b-8356-e10a4b5b3a4a-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.681612 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/441b3c5d-0205-472b-8356-e10a4b5b3a4a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.702459 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"441b3c5d-0205-472b-8356-e10a4b5b3a4a\") " pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.749150 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.789782 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.804644 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.811204 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.811453 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-chp7n" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.812082 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.847961 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/08494b3a-f49b-49af-8e06-df5f4fac3171-kolla-config\") pod \"memcached-0\" (UID: \"08494b3a-f49b-49af-8e06-df5f4fac3171\") " pod="openstack/memcached-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.848011 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/08494b3a-f49b-49af-8e06-df5f4fac3171-memcached-tls-certs\") pod \"memcached-0\" (UID: \"08494b3a-f49b-49af-8e06-df5f4fac3171\") " pod="openstack/memcached-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.848073 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08494b3a-f49b-49af-8e06-df5f4fac3171-config-data\") pod \"memcached-0\" (UID: \"08494b3a-f49b-49af-8e06-df5f4fac3171\") " pod="openstack/memcached-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.848114 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08494b3a-f49b-49af-8e06-df5f4fac3171-combined-ca-bundle\") pod \"memcached-0\" (UID: 
\"08494b3a-f49b-49af-8e06-df5f4fac3171\") " pod="openstack/memcached-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.848137 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9x9x\" (UniqueName: \"kubernetes.io/projected/08494b3a-f49b-49af-8e06-df5f4fac3171-kube-api-access-l9x9x\") pod \"memcached-0\" (UID: \"08494b3a-f49b-49af-8e06-df5f4fac3171\") " pod="openstack/memcached-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.851112 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.953207 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/08494b3a-f49b-49af-8e06-df5f4fac3171-kolla-config\") pod \"memcached-0\" (UID: \"08494b3a-f49b-49af-8e06-df5f4fac3171\") " pod="openstack/memcached-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.953769 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/08494b3a-f49b-49af-8e06-df5f4fac3171-memcached-tls-certs\") pod \"memcached-0\" (UID: \"08494b3a-f49b-49af-8e06-df5f4fac3171\") " pod="openstack/memcached-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.953838 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08494b3a-f49b-49af-8e06-df5f4fac3171-config-data\") pod \"memcached-0\" (UID: \"08494b3a-f49b-49af-8e06-df5f4fac3171\") " pod="openstack/memcached-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.953874 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08494b3a-f49b-49af-8e06-df5f4fac3171-combined-ca-bundle\") pod \"memcached-0\" (UID: \"08494b3a-f49b-49af-8e06-df5f4fac3171\") 
" pod="openstack/memcached-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.953901 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9x9x\" (UniqueName: \"kubernetes.io/projected/08494b3a-f49b-49af-8e06-df5f4fac3171-kube-api-access-l9x9x\") pod \"memcached-0\" (UID: \"08494b3a-f49b-49af-8e06-df5f4fac3171\") " pod="openstack/memcached-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.954211 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/08494b3a-f49b-49af-8e06-df5f4fac3171-kolla-config\") pod \"memcached-0\" (UID: \"08494b3a-f49b-49af-8e06-df5f4fac3171\") " pod="openstack/memcached-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.954834 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08494b3a-f49b-49af-8e06-df5f4fac3171-config-data\") pod \"memcached-0\" (UID: \"08494b3a-f49b-49af-8e06-df5f4fac3171\") " pod="openstack/memcached-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.972468 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/08494b3a-f49b-49af-8e06-df5f4fac3171-memcached-tls-certs\") pod \"memcached-0\" (UID: \"08494b3a-f49b-49af-8e06-df5f4fac3171\") " pod="openstack/memcached-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.975357 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08494b3a-f49b-49af-8e06-df5f4fac3171-combined-ca-bundle\") pod \"memcached-0\" (UID: \"08494b3a-f49b-49af-8e06-df5f4fac3171\") " pod="openstack/memcached-0" Oct 06 06:59:28 crc kubenswrapper[4845]: I1006 06:59:28.975970 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9x9x\" (UniqueName: 
\"kubernetes.io/projected/08494b3a-f49b-49af-8e06-df5f4fac3171-kube-api-access-l9x9x\") pod \"memcached-0\" (UID: \"08494b3a-f49b-49af-8e06-df5f4fac3171\") " pod="openstack/memcached-0" Oct 06 06:59:29 crc kubenswrapper[4845]: I1006 06:59:29.187774 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 06 06:59:29 crc kubenswrapper[4845]: I1006 06:59:29.299175 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 06:59:29 crc kubenswrapper[4845]: W1006 06:59:29.341012 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d5452f2_c63d_4287_93c4_17b89651a7c1.slice/crio-91bbff08503fb26fe587991e254b8b66db49a871ff59c3e17f84f2f4612454f5 WatchSource:0}: Error finding container 91bbff08503fb26fe587991e254b8b66db49a871ff59c3e17f84f2f4612454f5: Status 404 returned error can't find the container with id 91bbff08503fb26fe587991e254b8b66db49a871ff59c3e17f84f2f4612454f5 Oct 06 06:59:29 crc kubenswrapper[4845]: I1006 06:59:29.438056 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 06:59:29 crc kubenswrapper[4845]: I1006 06:59:29.732851 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"441b3c5d-0205-472b-8356-e10a4b5b3a4a","Type":"ContainerStarted","Data":"42e5aac097692ffd24760fa3eeb0e519787aa2f531b7c9aed72566ced1c6cfa0"} Oct 06 06:59:29 crc kubenswrapper[4845]: I1006 06:59:29.755066 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 06 06:59:29 crc kubenswrapper[4845]: I1006 06:59:29.764667 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8d5452f2-c63d-4287-93c4-17b89651a7c1","Type":"ContainerStarted","Data":"91bbff08503fb26fe587991e254b8b66db49a871ff59c3e17f84f2f4612454f5"} Oct 06 06:59:30 crc 
kubenswrapper[4845]: I1006 06:59:30.638665 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 06:59:30 crc kubenswrapper[4845]: I1006 06:59:30.639699 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 06:59:30 crc kubenswrapper[4845]: I1006 06:59:30.644321 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-pwl8p" Oct 06 06:59:30 crc kubenswrapper[4845]: I1006 06:59:30.648691 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 06:59:30 crc kubenswrapper[4845]: I1006 06:59:30.685421 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjhvp\" (UniqueName: \"kubernetes.io/projected/13f70ac7-bc2e-4cfe-a094-f78ec31b3879-kube-api-access-cjhvp\") pod \"kube-state-metrics-0\" (UID: \"13f70ac7-bc2e-4cfe-a094-f78ec31b3879\") " pod="openstack/kube-state-metrics-0" Oct 06 06:59:30 crc kubenswrapper[4845]: I1006 06:59:30.785393 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"08494b3a-f49b-49af-8e06-df5f4fac3171","Type":"ContainerStarted","Data":"de0ff2f78304ea2a815415c8a69934849fe829e5288d5026738ddfd63051aa75"} Oct 06 06:59:30 crc kubenswrapper[4845]: I1006 06:59:30.788243 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjhvp\" (UniqueName: \"kubernetes.io/projected/13f70ac7-bc2e-4cfe-a094-f78ec31b3879-kube-api-access-cjhvp\") pod \"kube-state-metrics-0\" (UID: \"13f70ac7-bc2e-4cfe-a094-f78ec31b3879\") " pod="openstack/kube-state-metrics-0" Oct 06 06:59:30 crc kubenswrapper[4845]: I1006 06:59:30.834388 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjhvp\" (UniqueName: \"kubernetes.io/projected/13f70ac7-bc2e-4cfe-a094-f78ec31b3879-kube-api-access-cjhvp\") 
pod \"kube-state-metrics-0\" (UID: \"13f70ac7-bc2e-4cfe-a094-f78ec31b3879\") " pod="openstack/kube-state-metrics-0" Oct 06 06:59:30 crc kubenswrapper[4845]: I1006 06:59:30.970442 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 06:59:31 crc kubenswrapper[4845]: I1006 06:59:31.440310 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 06:59:31 crc kubenswrapper[4845]: I1006 06:59:31.793174 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"13f70ac7-bc2e-4cfe-a094-f78ec31b3879","Type":"ContainerStarted","Data":"978d205a1677d0f0110cc95cdd27ecb90389f2c95b3b36bc29c7ee371e49efd0"} Oct 06 06:59:33 crc kubenswrapper[4845]: I1006 06:59:33.901146 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-v4zd6"] Oct 06 06:59:33 crc kubenswrapper[4845]: I1006 06:59:33.902269 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:33 crc kubenswrapper[4845]: I1006 06:59:33.905200 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-hlg7p" Oct 06 06:59:33 crc kubenswrapper[4845]: I1006 06:59:33.905436 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 06 06:59:33 crc kubenswrapper[4845]: I1006 06:59:33.905604 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 06 06:59:33 crc kubenswrapper[4845]: I1006 06:59:33.943484 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v4zd6"] Oct 06 06:59:33 crc kubenswrapper[4845]: I1006 06:59:33.961531 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2ee0908-39a9-4303-aad3-040a922d20a7-scripts\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:33 crc kubenswrapper[4845]: I1006 06:59:33.961633 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2dq6\" (UniqueName: \"kubernetes.io/projected/e2ee0908-39a9-4303-aad3-040a922d20a7-kube-api-access-c2dq6\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:33 crc kubenswrapper[4845]: I1006 06:59:33.961665 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2ee0908-39a9-4303-aad3-040a922d20a7-combined-ca-bundle\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:33 crc kubenswrapper[4845]: I1006 06:59:33.961696 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2ee0908-39a9-4303-aad3-040a922d20a7-ovn-controller-tls-certs\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:33 crc kubenswrapper[4845]: I1006 06:59:33.961719 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e2ee0908-39a9-4303-aad3-040a922d20a7-var-run\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:33 crc kubenswrapper[4845]: I1006 06:59:33.961746 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ee0908-39a9-4303-aad3-040a922d20a7-var-run-ovn\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:33 crc kubenswrapper[4845]: I1006 06:59:33.961809 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ee0908-39a9-4303-aad3-040a922d20a7-var-log-ovn\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:33 crc kubenswrapper[4845]: I1006 06:59:33.962555 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-n9jwg"] Oct 06 06:59:33 crc kubenswrapper[4845]: I1006 06:59:33.964532 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:33 crc kubenswrapper[4845]: I1006 06:59:33.986484 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-n9jwg"] Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.063490 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ee0908-39a9-4303-aad3-040a922d20a7-var-run-ovn\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.064137 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ee0908-39a9-4303-aad3-040a922d20a7-var-run-ovn\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.064426 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmcjw\" (UniqueName: \"kubernetes.io/projected/1e7e45f8-ca4d-473e-9c7e-12bb2626a080-kube-api-access-hmcjw\") pod \"ovn-controller-ovs-n9jwg\" (UID: \"1e7e45f8-ca4d-473e-9c7e-12bb2626a080\") " pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.064484 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ee0908-39a9-4303-aad3-040a922d20a7-var-log-ovn\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.064524 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1e7e45f8-ca4d-473e-9c7e-12bb2626a080-var-lib\") pod 
\"ovn-controller-ovs-n9jwg\" (UID: \"1e7e45f8-ca4d-473e-9c7e-12bb2626a080\") " pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.064562 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2ee0908-39a9-4303-aad3-040a922d20a7-scripts\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.064638 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e7e45f8-ca4d-473e-9c7e-12bb2626a080-scripts\") pod \"ovn-controller-ovs-n9jwg\" (UID: \"1e7e45f8-ca4d-473e-9c7e-12bb2626a080\") " pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.064702 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2dq6\" (UniqueName: \"kubernetes.io/projected/e2ee0908-39a9-4303-aad3-040a922d20a7-kube-api-access-c2dq6\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.064641 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ee0908-39a9-4303-aad3-040a922d20a7-var-log-ovn\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.064779 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2ee0908-39a9-4303-aad3-040a922d20a7-combined-ca-bundle\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 
06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.064832 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1e7e45f8-ca4d-473e-9c7e-12bb2626a080-etc-ovs\") pod \"ovn-controller-ovs-n9jwg\" (UID: \"1e7e45f8-ca4d-473e-9c7e-12bb2626a080\") " pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.064863 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2ee0908-39a9-4303-aad3-040a922d20a7-ovn-controller-tls-certs\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.064896 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1e7e45f8-ca4d-473e-9c7e-12bb2626a080-var-log\") pod \"ovn-controller-ovs-n9jwg\" (UID: \"1e7e45f8-ca4d-473e-9c7e-12bb2626a080\") " pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.065002 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e2ee0908-39a9-4303-aad3-040a922d20a7-var-run\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.065033 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1e7e45f8-ca4d-473e-9c7e-12bb2626a080-var-run\") pod \"ovn-controller-ovs-n9jwg\" (UID: \"1e7e45f8-ca4d-473e-9c7e-12bb2626a080\") " pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.065482 4845 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e2ee0908-39a9-4303-aad3-040a922d20a7-var-run\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.067409 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2ee0908-39a9-4303-aad3-040a922d20a7-scripts\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.073161 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2ee0908-39a9-4303-aad3-040a922d20a7-ovn-controller-tls-certs\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.100179 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2ee0908-39a9-4303-aad3-040a922d20a7-combined-ca-bundle\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.119302 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2dq6\" (UniqueName: \"kubernetes.io/projected/e2ee0908-39a9-4303-aad3-040a922d20a7-kube-api-access-c2dq6\") pod \"ovn-controller-v4zd6\" (UID: \"e2ee0908-39a9-4303-aad3-040a922d20a7\") " pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.166575 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1e7e45f8-ca4d-473e-9c7e-12bb2626a080-var-log\") pod 
\"ovn-controller-ovs-n9jwg\" (UID: \"1e7e45f8-ca4d-473e-9c7e-12bb2626a080\") " pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.166624 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1e7e45f8-ca4d-473e-9c7e-12bb2626a080-var-run\") pod \"ovn-controller-ovs-n9jwg\" (UID: \"1e7e45f8-ca4d-473e-9c7e-12bb2626a080\") " pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.166660 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmcjw\" (UniqueName: \"kubernetes.io/projected/1e7e45f8-ca4d-473e-9c7e-12bb2626a080-kube-api-access-hmcjw\") pod \"ovn-controller-ovs-n9jwg\" (UID: \"1e7e45f8-ca4d-473e-9c7e-12bb2626a080\") " pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.166689 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1e7e45f8-ca4d-473e-9c7e-12bb2626a080-var-lib\") pod \"ovn-controller-ovs-n9jwg\" (UID: \"1e7e45f8-ca4d-473e-9c7e-12bb2626a080\") " pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.166726 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e7e45f8-ca4d-473e-9c7e-12bb2626a080-scripts\") pod \"ovn-controller-ovs-n9jwg\" (UID: \"1e7e45f8-ca4d-473e-9c7e-12bb2626a080\") " pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.166783 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1e7e45f8-ca4d-473e-9c7e-12bb2626a080-etc-ovs\") pod \"ovn-controller-ovs-n9jwg\" (UID: \"1e7e45f8-ca4d-473e-9c7e-12bb2626a080\") " pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:34 
crc kubenswrapper[4845]: I1006 06:59:34.167113 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1e7e45f8-ca4d-473e-9c7e-12bb2626a080-etc-ovs\") pod \"ovn-controller-ovs-n9jwg\" (UID: \"1e7e45f8-ca4d-473e-9c7e-12bb2626a080\") " pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.167198 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1e7e45f8-ca4d-473e-9c7e-12bb2626a080-var-log\") pod \"ovn-controller-ovs-n9jwg\" (UID: \"1e7e45f8-ca4d-473e-9c7e-12bb2626a080\") " pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.167234 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1e7e45f8-ca4d-473e-9c7e-12bb2626a080-var-run\") pod \"ovn-controller-ovs-n9jwg\" (UID: \"1e7e45f8-ca4d-473e-9c7e-12bb2626a080\") " pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.167794 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1e7e45f8-ca4d-473e-9c7e-12bb2626a080-var-lib\") pod \"ovn-controller-ovs-n9jwg\" (UID: \"1e7e45f8-ca4d-473e-9c7e-12bb2626a080\") " pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.169941 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e7e45f8-ca4d-473e-9c7e-12bb2626a080-scripts\") pod \"ovn-controller-ovs-n9jwg\" (UID: \"1e7e45f8-ca4d-473e-9c7e-12bb2626a080\") " pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.203925 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmcjw\" (UniqueName: 
\"kubernetes.io/projected/1e7e45f8-ca4d-473e-9c7e-12bb2626a080-kube-api-access-hmcjw\") pod \"ovn-controller-ovs-n9jwg\" (UID: \"1e7e45f8-ca4d-473e-9c7e-12bb2626a080\") " pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.299260 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.315762 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.556170 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.557454 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.560076 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.560603 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.560647 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.561030 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.561636 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-v44n4" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.568010 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.684536 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dad007b-9982-4f85-842c-083964cd2734-config\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.684621 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dad007b-9982-4f85-842c-083964cd2734-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.684779 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3dad007b-9982-4f85-842c-083964cd2734-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.684894 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57k4r\" (UniqueName: \"kubernetes.io/projected/3dad007b-9982-4f85-842c-083964cd2734-kube-api-access-57k4r\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.684920 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.685076 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/3dad007b-9982-4f85-842c-083964cd2734-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.685133 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3dad007b-9982-4f85-842c-083964cd2734-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.685150 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dad007b-9982-4f85-842c-083964cd2734-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.786172 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dad007b-9982-4f85-842c-083964cd2734-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.786231 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3dad007b-9982-4f85-842c-083964cd2734-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.786266 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57k4r\" (UniqueName: \"kubernetes.io/projected/3dad007b-9982-4f85-842c-083964cd2734-kube-api-access-57k4r\") pod 
\"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.786287 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.786934 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.787276 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3dad007b-9982-4f85-842c-083964cd2734-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.786361 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dad007b-9982-4f85-842c-083964cd2734-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.789112 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3dad007b-9982-4f85-842c-083964cd2734-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.789718 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3dad007b-9982-4f85-842c-083964cd2734-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.797532 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dad007b-9982-4f85-842c-083964cd2734-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.789910 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dad007b-9982-4f85-842c-083964cd2734-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.797730 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dad007b-9982-4f85-842c-083964cd2734-config\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.798572 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dad007b-9982-4f85-842c-083964cd2734-config\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.800927 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dad007b-9982-4f85-842c-083964cd2734-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" 
(UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.801865 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dad007b-9982-4f85-842c-083964cd2734-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.803741 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57k4r\" (UniqueName: \"kubernetes.io/projected/3dad007b-9982-4f85-842c-083964cd2734-kube-api-access-57k4r\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.806792 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3dad007b-9982-4f85-842c-083964cd2734\") " pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:34 crc kubenswrapper[4845]: I1006 06:59:34.886651 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 06 06:59:37 crc kubenswrapper[4845]: I1006 06:59:37.877129 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 06:59:37 crc kubenswrapper[4845]: I1006 06:59:37.879258 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:37 crc kubenswrapper[4845]: I1006 06:59:37.883804 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 06 06:59:37 crc kubenswrapper[4845]: I1006 06:59:37.883868 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-cg8dm" Oct 06 06:59:37 crc kubenswrapper[4845]: I1006 06:59:37.885291 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 06 06:59:37 crc kubenswrapper[4845]: I1006 06:59:37.885549 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 06 06:59:37 crc kubenswrapper[4845]: I1006 06:59:37.893124 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 06:59:37 crc kubenswrapper[4845]: I1006 06:59:37.966201 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/952ffa29-f400-4b01-a4b7-282a401db753-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:37 crc kubenswrapper[4845]: I1006 06:59:37.966257 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:37 crc kubenswrapper[4845]: I1006 06:59:37.966405 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhkhr\" (UniqueName: \"kubernetes.io/projected/952ffa29-f400-4b01-a4b7-282a401db753-kube-api-access-fhkhr\") pod \"ovsdbserver-sb-0\" (UID: 
\"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:37 crc kubenswrapper[4845]: I1006 06:59:37.966444 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/952ffa29-f400-4b01-a4b7-282a401db753-config\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:37 crc kubenswrapper[4845]: I1006 06:59:37.966461 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/952ffa29-f400-4b01-a4b7-282a401db753-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:37 crc kubenswrapper[4845]: I1006 06:59:37.966491 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/952ffa29-f400-4b01-a4b7-282a401db753-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:37 crc kubenswrapper[4845]: I1006 06:59:37.966670 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/952ffa29-f400-4b01-a4b7-282a401db753-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:37 crc kubenswrapper[4845]: I1006 06:59:37.966795 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/952ffa29-f400-4b01-a4b7-282a401db753-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 
06 06:59:38 crc kubenswrapper[4845]: I1006 06:59:38.069093 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/952ffa29-f400-4b01-a4b7-282a401db753-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:38 crc kubenswrapper[4845]: I1006 06:59:38.069162 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/952ffa29-f400-4b01-a4b7-282a401db753-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:38 crc kubenswrapper[4845]: I1006 06:59:38.069208 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/952ffa29-f400-4b01-a4b7-282a401db753-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:38 crc kubenswrapper[4845]: I1006 06:59:38.069237 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:38 crc kubenswrapper[4845]: I1006 06:59:38.069277 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhkhr\" (UniqueName: \"kubernetes.io/projected/952ffa29-f400-4b01-a4b7-282a401db753-kube-api-access-fhkhr\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:38 crc kubenswrapper[4845]: I1006 06:59:38.069316 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/952ffa29-f400-4b01-a4b7-282a401db753-config\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:38 crc kubenswrapper[4845]: I1006 06:59:38.069341 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/952ffa29-f400-4b01-a4b7-282a401db753-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:38 crc kubenswrapper[4845]: I1006 06:59:38.069404 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/952ffa29-f400-4b01-a4b7-282a401db753-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:38 crc kubenswrapper[4845]: I1006 06:59:38.069574 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:38 crc kubenswrapper[4845]: I1006 06:59:38.070498 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/952ffa29-f400-4b01-a4b7-282a401db753-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:38 crc kubenswrapper[4845]: I1006 06:59:38.070560 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/952ffa29-f400-4b01-a4b7-282a401db753-config\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " 
pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:38 crc kubenswrapper[4845]: I1006 06:59:38.071311 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/952ffa29-f400-4b01-a4b7-282a401db753-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:38 crc kubenswrapper[4845]: I1006 06:59:38.076062 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/952ffa29-f400-4b01-a4b7-282a401db753-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:38 crc kubenswrapper[4845]: I1006 06:59:38.090940 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhkhr\" (UniqueName: \"kubernetes.io/projected/952ffa29-f400-4b01-a4b7-282a401db753-kube-api-access-fhkhr\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:38 crc kubenswrapper[4845]: I1006 06:59:38.101055 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/952ffa29-f400-4b01-a4b7-282a401db753-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:38 crc kubenswrapper[4845]: I1006 06:59:38.101584 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/952ffa29-f400-4b01-a4b7-282a401db753-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:38 crc kubenswrapper[4845]: I1006 06:59:38.120030 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"952ffa29-f400-4b01-a4b7-282a401db753\") " pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:38 crc kubenswrapper[4845]: I1006 06:59:38.204999 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 06 06:59:40 crc kubenswrapper[4845]: I1006 06:59:40.103592 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-n9jwg"] Oct 06 06:59:44 crc kubenswrapper[4845]: I1006 06:59:44.592392 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 06:59:44 crc kubenswrapper[4845]: I1006 06:59:44.605601 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v4zd6"] Oct 06 06:59:44 crc kubenswrapper[4845]: I1006 06:59:44.926793 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-n9jwg" event={"ID":"1e7e45f8-ca4d-473e-9c7e-12bb2626a080","Type":"ContainerStarted","Data":"cbba62626929c18ead4db4714955a944f6805e175c09747aa07f796a314069f6"} Oct 06 06:59:52 crc kubenswrapper[4845]: E1006 06:59:52.062994 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:b78cfc68a577b1553523c8a70a34e297" Oct 06 06:59:52 crc kubenswrapper[4845]: E1006 06:59:52.063820 4845 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:b78cfc68a577b1553523c8a70a34e297" Oct 06 06:59:52 crc kubenswrapper[4845]: E1006 06:59:52.064033 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:setup-container,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:b78cfc68a577b1553523c8a70a34e297,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbp69,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Li
venessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(38d9a5cf-6de3-487c-a71c-374ca55ca525): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 06:59:52 crc kubenswrapper[4845]: E1006 06:59:52.065333 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="38d9a5cf-6de3-487c-a71c-374ca55ca525" Oct 06 06:59:53 crc kubenswrapper[4845]: I1006 06:59:53.018711 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 06:59:53 crc kubenswrapper[4845]: I1006 06:59:53.019123 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 06:59:53 crc kubenswrapper[4845]: I1006 06:59:53.019048 4845 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3dad007b-9982-4f85-842c-083964cd2734","Type":"ContainerStarted","Data":"48505458fe6dd652abead42fbbff293c1115e4ea4a9abd02c9d52ebad7c1ba34"} Oct 06 06:59:53 crc kubenswrapper[4845]: I1006 06:59:53.021129 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v4zd6" event={"ID":"e2ee0908-39a9-4303-aad3-040a922d20a7","Type":"ContainerStarted","Data":"f7fa7e2423b9864421a9f4eb841d1f1fc7ced7f5806e5aace510900c4341e49b"} Oct 06 06:59:53 crc kubenswrapper[4845]: E1006 06:59:53.025188 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:b78cfc68a577b1553523c8a70a34e297\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="38d9a5cf-6de3-487c-a71c-374ca55ca525" Oct 06 06:59:54 crc kubenswrapper[4845]: I1006 06:59:54.360137 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 06:59:56 crc kubenswrapper[4845]: I1006 06:59:56.047653 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"952ffa29-f400-4b01-a4b7-282a401db753","Type":"ContainerStarted","Data":"6adcfb9664351be4a5102fd64d7add126e8ba9a088a56e707e68f18854acec5f"} Oct 06 06:59:57 crc kubenswrapper[4845]: I1006 06:59:57.062277 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"08494b3a-f49b-49af-8e06-df5f4fac3171","Type":"ContainerStarted","Data":"9594a5886d35b8ebbd7cda12ec39216359cdfad3f48cfd86d58ae2c2ef069c7b"} Oct 06 06:59:57 crc kubenswrapper[4845]: E1006 06:59:57.274096 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d353f56_d5d7_41c5_941a_fa4dd569b3d8.slice/crio-conmon-54d064a56f146c7e178e135b3e2fd75a1eb0fa5f4138c1a95876dd7029242874.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d353f56_d5d7_41c5_941a_fa4dd569b3d8.slice/crio-54d064a56f146c7e178e135b3e2fd75a1eb0fa5f4138c1a95876dd7029242874.scope\": RecentStats: unable to find data in memory cache]" Oct 06 06:59:58 crc kubenswrapper[4845]: I1006 06:59:58.071277 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"952ffa29-f400-4b01-a4b7-282a401db753","Type":"ContainerStarted","Data":"7533555892f4e57f1b0336419ef85d32bba84d7c392cb6c7c78679ce4699fb08"} Oct 06 06:59:58 crc kubenswrapper[4845]: I1006 06:59:58.073683 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8d5452f2-c63d-4287-93c4-17b89651a7c1","Type":"ContainerStarted","Data":"c646108b662cee3dae7b57eb0a78ce79dd4cc903192ff502449206048d67230e"} Oct 06 06:59:58 crc kubenswrapper[4845]: I1006 06:59:58.076278 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" event={"ID":"a12043d8-5d3d-4eb7-918e-c8b620d880ca","Type":"ContainerStarted","Data":"8847c4e5abc6c866d6acd047931e122a9d7670c1599b52b4a88b34607c715f4b"} Oct 06 06:59:58 crc kubenswrapper[4845]: I1006 06:59:58.078913 4845 generic.go:334] "Generic (PLEG): container finished" podID="6d353f56-d5d7-41c5-941a-fa4dd569b3d8" containerID="54d064a56f146c7e178e135b3e2fd75a1eb0fa5f4138c1a95876dd7029242874" exitCode=0 Oct 06 06:59:58 crc kubenswrapper[4845]: I1006 06:59:58.079024 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d7c66485-5k26z" event={"ID":"6d353f56-d5d7-41c5-941a-fa4dd569b3d8","Type":"ContainerDied","Data":"54d064a56f146c7e178e135b3e2fd75a1eb0fa5f4138c1a95876dd7029242874"} Oct 06 06:59:58 crc 
kubenswrapper[4845]: I1006 06:59:58.079144 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 06 06:59:58 crc kubenswrapper[4845]: I1006 06:59:58.134048 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=5.958341505 podStartE2EDuration="30.13402606s" podCreationTimestamp="2025-10-06 06:59:28 +0000 UTC" firstStartedPulling="2025-10-06 06:59:29.810204636 +0000 UTC m=+854.324945644" lastFinishedPulling="2025-10-06 06:59:53.985889191 +0000 UTC m=+878.500630199" observedRunningTime="2025-10-06 06:59:58.130397119 +0000 UTC m=+882.645138137" watchObservedRunningTime="2025-10-06 06:59:58.13402606 +0000 UTC m=+882.648767078" Oct 06 06:59:58 crc kubenswrapper[4845]: I1006 06:59:58.533920 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d7c66485-5k26z" Oct 06 06:59:58 crc kubenswrapper[4845]: I1006 06:59:58.670197 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d353f56-d5d7-41c5-941a-fa4dd569b3d8-dns-svc\") pod \"6d353f56-d5d7-41c5-941a-fa4dd569b3d8\" (UID: \"6d353f56-d5d7-41c5-941a-fa4dd569b3d8\") " Oct 06 06:59:58 crc kubenswrapper[4845]: I1006 06:59:58.670358 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d353f56-d5d7-41c5-941a-fa4dd569b3d8-config\") pod \"6d353f56-d5d7-41c5-941a-fa4dd569b3d8\" (UID: \"6d353f56-d5d7-41c5-941a-fa4dd569b3d8\") " Oct 06 06:59:58 crc kubenswrapper[4845]: I1006 06:59:58.670477 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8pz5\" (UniqueName: \"kubernetes.io/projected/6d353f56-d5d7-41c5-941a-fa4dd569b3d8-kube-api-access-d8pz5\") pod \"6d353f56-d5d7-41c5-941a-fa4dd569b3d8\" (UID: \"6d353f56-d5d7-41c5-941a-fa4dd569b3d8\") " Oct 06 06:59:58 crc 
kubenswrapper[4845]: I1006 06:59:58.680209 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d353f56-d5d7-41c5-941a-fa4dd569b3d8-kube-api-access-d8pz5" (OuterVolumeSpecName: "kube-api-access-d8pz5") pod "6d353f56-d5d7-41c5-941a-fa4dd569b3d8" (UID: "6d353f56-d5d7-41c5-941a-fa4dd569b3d8"). InnerVolumeSpecName "kube-api-access-d8pz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:59:58 crc kubenswrapper[4845]: I1006 06:59:58.694167 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d353f56-d5d7-41c5-941a-fa4dd569b3d8-config" (OuterVolumeSpecName: "config") pod "6d353f56-d5d7-41c5-941a-fa4dd569b3d8" (UID: "6d353f56-d5d7-41c5-941a-fa4dd569b3d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:59:58 crc kubenswrapper[4845]: I1006 06:59:58.698717 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d353f56-d5d7-41c5-941a-fa4dd569b3d8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6d353f56-d5d7-41c5-941a-fa4dd569b3d8" (UID: "6d353f56-d5d7-41c5-941a-fa4dd569b3d8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:59:58 crc kubenswrapper[4845]: I1006 06:59:58.772089 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8pz5\" (UniqueName: \"kubernetes.io/projected/6d353f56-d5d7-41c5-941a-fa4dd569b3d8-kube-api-access-d8pz5\") on node \"crc\" DevicePath \"\"" Oct 06 06:59:58 crc kubenswrapper[4845]: I1006 06:59:58.772129 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d353f56-d5d7-41c5-941a-fa4dd569b3d8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 06:59:58 crc kubenswrapper[4845]: I1006 06:59:58.772139 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d353f56-d5d7-41c5-941a-fa4dd569b3d8-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.094195 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d7c66485-5k26z" event={"ID":"6d353f56-d5d7-41c5-941a-fa4dd569b3d8","Type":"ContainerDied","Data":"8bcf38cde8ae3069f932f1b04c5d2c24efc1b4efd9e314cfdf0a24350aec905f"} Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.094287 4845 scope.go:117] "RemoveContainer" containerID="54d064a56f146c7e178e135b3e2fd75a1eb0fa5f4138c1a95876dd7029242874" Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.094462 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9d7c66485-5k26z" Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.108499 4845 generic.go:334] "Generic (PLEG): container finished" podID="1e7e45f8-ca4d-473e-9c7e-12bb2626a080" containerID="5baf1d6a520ffe9379dc74018b817c02b6c5a014e4cf3804d10807b81487c339" exitCode=0 Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.108612 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-n9jwg" event={"ID":"1e7e45f8-ca4d-473e-9c7e-12bb2626a080","Type":"ContainerDied","Data":"5baf1d6a520ffe9379dc74018b817c02b6c5a014e4cf3804d10807b81487c339"} Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.111945 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6a669571-1ec3-4cb8-8a07-e20c31ca87e5","Type":"ContainerStarted","Data":"96190030a88ff80858e707644abc72592e8c49a60ec3b2a2aedfd6d04a82a1df"} Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.114050 4845 generic.go:334] "Generic (PLEG): container finished" podID="e6c48b55-c500-4427-a66e-ddaa189a08af" containerID="67daad6347c7369a20594650d1c9cf0167bcc13b56c2b21d2cb35ea1a36c91c7" exitCode=0 Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.114120 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db48c6849-5wxlp" event={"ID":"e6c48b55-c500-4427-a66e-ddaa189a08af","Type":"ContainerDied","Data":"67daad6347c7369a20594650d1c9cf0167bcc13b56c2b21d2cb35ea1a36c91c7"} Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.117681 4845 generic.go:334] "Generic (PLEG): container finished" podID="199df9d9-f617-44a5-8afe-9e5d086c249b" containerID="b4758d64fef3098f5ed37068f25299d3d54da4c7e15d729f88f20491ba6bfa58" exitCode=0 Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.117762 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" 
event={"ID":"199df9d9-f617-44a5-8afe-9e5d086c249b","Type":"ContainerDied","Data":"b4758d64fef3098f5ed37068f25299d3d54da4c7e15d729f88f20491ba6bfa58"} Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.123291 4845 generic.go:334] "Generic (PLEG): container finished" podID="a12043d8-5d3d-4eb7-918e-c8b620d880ca" containerID="8847c4e5abc6c866d6acd047931e122a9d7670c1599b52b4a88b34607c715f4b" exitCode=0 Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.123363 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" event={"ID":"a12043d8-5d3d-4eb7-918e-c8b620d880ca","Type":"ContainerDied","Data":"8847c4e5abc6c866d6acd047931e122a9d7670c1599b52b4a88b34607c715f4b"} Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.130113 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v4zd6" event={"ID":"e2ee0908-39a9-4303-aad3-040a922d20a7","Type":"ContainerStarted","Data":"d722495be0a3c99862991f434abe4e3f684bbc4207f4688108b15b64281a966f"} Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.130609 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-v4zd6" Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.133312 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3dad007b-9982-4f85-842c-083964cd2734","Type":"ContainerStarted","Data":"6c678705af1f9b3df74b42305eda68b8408b0478f7149118fb3f9c5c72100dfd"} Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.140644 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"13f70ac7-bc2e-4cfe-a094-f78ec31b3879","Type":"ContainerStarted","Data":"82e094cc169c03e8a608336ecaee70b6d78617ca0f2f6d11927320ffbf09c7a4"} Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.140826 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 06 06:59:59 crc 
kubenswrapper[4845]: I1006 06:59:59.146487 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"441b3c5d-0205-472b-8356-e10a4b5b3a4a","Type":"ContainerStarted","Data":"2e8eb11c3a6714703ca50e96ed9b2597b8e25eb366c5b6aeb4895de93ae7500b"} Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.328969 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d7c66485-5k26z"] Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.335483 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9d7c66485-5k26z"] Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.347104 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-v4zd6" podStartSLOduration=21.625419337 podStartE2EDuration="26.347087472s" podCreationTimestamp="2025-10-06 06:59:33 +0000 UTC" firstStartedPulling="2025-10-06 06:59:52.052280261 +0000 UTC m=+876.567021279" lastFinishedPulling="2025-10-06 06:59:56.773948406 +0000 UTC m=+881.288689414" observedRunningTime="2025-10-06 06:59:59.346545809 +0000 UTC m=+883.861286817" watchObservedRunningTime="2025-10-06 06:59:59.347087472 +0000 UTC m=+883.861828480" Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.365457 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.20171318 podStartE2EDuration="29.365441034s" podCreationTimestamp="2025-10-06 06:59:30 +0000 UTC" firstStartedPulling="2025-10-06 06:59:31.454984495 +0000 UTC m=+855.969725503" lastFinishedPulling="2025-10-06 06:59:57.618712349 +0000 UTC m=+882.133453357" observedRunningTime="2025-10-06 06:59:59.361437573 +0000 UTC m=+883.876178581" watchObservedRunningTime="2025-10-06 06:59:59.365441034 +0000 UTC m=+883.880182042" Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.548011 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6db48c6849-5wxlp" Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.594284 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6c9l\" (UniqueName: \"kubernetes.io/projected/e6c48b55-c500-4427-a66e-ddaa189a08af-kube-api-access-f6c9l\") pod \"e6c48b55-c500-4427-a66e-ddaa189a08af\" (UID: \"e6c48b55-c500-4427-a66e-ddaa189a08af\") " Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.594914 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6c48b55-c500-4427-a66e-ddaa189a08af-config\") pod \"e6c48b55-c500-4427-a66e-ddaa189a08af\" (UID: \"e6c48b55-c500-4427-a66e-ddaa189a08af\") " Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.621629 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6c48b55-c500-4427-a66e-ddaa189a08af-kube-api-access-f6c9l" (OuterVolumeSpecName: "kube-api-access-f6c9l") pod "e6c48b55-c500-4427-a66e-ddaa189a08af" (UID: "e6c48b55-c500-4427-a66e-ddaa189a08af"). InnerVolumeSpecName "kube-api-access-f6c9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.625724 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6c48b55-c500-4427-a66e-ddaa189a08af-config" (OuterVolumeSpecName: "config") pod "e6c48b55-c500-4427-a66e-ddaa189a08af" (UID: "e6c48b55-c500-4427-a66e-ddaa189a08af"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.696412 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6c48b55-c500-4427-a66e-ddaa189a08af-config\") on node \"crc\" DevicePath \"\"" Oct 06 06:59:59 crc kubenswrapper[4845]: I1006 06:59:59.696452 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6c9l\" (UniqueName: \"kubernetes.io/projected/e6c48b55-c500-4427-a66e-ddaa189a08af-kube-api-access-f6c9l\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.130902 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z"] Oct 06 07:00:00 crc kubenswrapper[4845]: E1006 07:00:00.131293 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d353f56-d5d7-41c5-941a-fa4dd569b3d8" containerName="init" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.131307 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d353f56-d5d7-41c5-941a-fa4dd569b3d8" containerName="init" Oct 06 07:00:00 crc kubenswrapper[4845]: E1006 07:00:00.131339 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c48b55-c500-4427-a66e-ddaa189a08af" containerName="init" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.131348 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c48b55-c500-4427-a66e-ddaa189a08af" containerName="init" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.131556 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d353f56-d5d7-41c5-941a-fa4dd569b3d8" containerName="init" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.131570 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6c48b55-c500-4427-a66e-ddaa189a08af" containerName="init" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.132228 4845 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.135765 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.135971 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.153891 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z"] Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.163119 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" event={"ID":"199df9d9-f617-44a5-8afe-9e5d086c249b","Type":"ContainerStarted","Data":"c4b60a28088874146f65a177fd53f646a82847c972949561b39e1c2bdaba6bd2"} Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.164347 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.172661 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db48c6849-5wxlp" event={"ID":"e6c48b55-c500-4427-a66e-ddaa189a08af","Type":"ContainerDied","Data":"5e1dbc4241db30e914d85b5a6f1d4b69ab850a9c9bad00ffee2bee99c743ab29"} Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.172716 4845 scope.go:117] "RemoveContainer" containerID="67daad6347c7369a20594650d1c9cf0167bcc13b56c2b21d2cb35ea1a36c91c7" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.172840 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6db48c6849-5wxlp" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.188525 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" podStartSLOduration=5.7706099779999995 podStartE2EDuration="36.188507901s" podCreationTimestamp="2025-10-06 06:59:24 +0000 UTC" firstStartedPulling="2025-10-06 06:59:25.233317894 +0000 UTC m=+849.748058902" lastFinishedPulling="2025-10-06 06:59:55.651215817 +0000 UTC m=+880.165956825" observedRunningTime="2025-10-06 07:00:00.183216588 +0000 UTC m=+884.697957596" watchObservedRunningTime="2025-10-06 07:00:00.188507901 +0000 UTC m=+884.703248909" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.204621 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49846891-c3bb-4413-a9ba-1d58fb45faf5-secret-volume\") pod \"collect-profiles-29328900-7p79z\" (UID: \"49846891-c3bb-4413-a9ba-1d58fb45faf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.204687 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlds2\" (UniqueName: \"kubernetes.io/projected/49846891-c3bb-4413-a9ba-1d58fb45faf5-kube-api-access-zlds2\") pod \"collect-profiles-29328900-7p79z\" (UID: \"49846891-c3bb-4413-a9ba-1d58fb45faf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.204708 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49846891-c3bb-4413-a9ba-1d58fb45faf5-config-volume\") pod \"collect-profiles-29328900-7p79z\" (UID: \"49846891-c3bb-4413-a9ba-1d58fb45faf5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.206850 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" event={"ID":"a12043d8-5d3d-4eb7-918e-c8b620d880ca","Type":"ContainerStarted","Data":"7f2db1ed13e49921c64d456b06660f47b779b48a985685352e4261a6400cbeed"} Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.207027 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.224586 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6db48c6849-5wxlp"] Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.224633 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.224649 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-n9jwg" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.224658 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-n9jwg" event={"ID":"1e7e45f8-ca4d-473e-9c7e-12bb2626a080","Type":"ContainerStarted","Data":"5641aec66c710473e35b1af0cba900659eb976b1e36b03496573ac68812ce48f"} Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.224672 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-n9jwg" event={"ID":"1e7e45f8-ca4d-473e-9c7e-12bb2626a080","Type":"ContainerStarted","Data":"79a8776afc18535a11a30c9cd6dcb35cec243fe0cc5bd44aa923394b493822b9"} Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.252362 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" podStartSLOduration=6.377575455 podStartE2EDuration="36.252321537s" podCreationTimestamp="2025-10-06 06:59:24 +0000 
UTC" firstStartedPulling="2025-10-06 06:59:25.49139913 +0000 UTC m=+850.006140138" lastFinishedPulling="2025-10-06 06:59:55.366145212 +0000 UTC m=+879.880886220" observedRunningTime="2025-10-06 07:00:00.238865039 +0000 UTC m=+884.753606067" watchObservedRunningTime="2025-10-06 07:00:00.252321537 +0000 UTC m=+884.767062605" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.258062 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d353f56-d5d7-41c5-941a-fa4dd569b3d8" path="/var/lib/kubelet/pods/6d353f56-d5d7-41c5-941a-fa4dd569b3d8/volumes" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.261189 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6db48c6849-5wxlp"] Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.275043 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-n9jwg" podStartSLOduration=16.245517995 podStartE2EDuration="27.275016339s" podCreationTimestamp="2025-10-06 06:59:33 +0000 UTC" firstStartedPulling="2025-10-06 06:59:44.886843146 +0000 UTC m=+869.401584154" lastFinishedPulling="2025-10-06 06:59:55.91634149 +0000 UTC m=+880.431082498" observedRunningTime="2025-10-06 07:00:00.274918876 +0000 UTC m=+884.789659894" watchObservedRunningTime="2025-10-06 07:00:00.275016339 +0000 UTC m=+884.789757347" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.305719 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlds2\" (UniqueName: \"kubernetes.io/projected/49846891-c3bb-4413-a9ba-1d58fb45faf5-kube-api-access-zlds2\") pod \"collect-profiles-29328900-7p79z\" (UID: \"49846891-c3bb-4413-a9ba-1d58fb45faf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.305766 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/49846891-c3bb-4413-a9ba-1d58fb45faf5-config-volume\") pod \"collect-profiles-29328900-7p79z\" (UID: \"49846891-c3bb-4413-a9ba-1d58fb45faf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.306175 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49846891-c3bb-4413-a9ba-1d58fb45faf5-secret-volume\") pod \"collect-profiles-29328900-7p79z\" (UID: \"49846891-c3bb-4413-a9ba-1d58fb45faf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.308042 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49846891-c3bb-4413-a9ba-1d58fb45faf5-config-volume\") pod \"collect-profiles-29328900-7p79z\" (UID: \"49846891-c3bb-4413-a9ba-1d58fb45faf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.314910 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49846891-c3bb-4413-a9ba-1d58fb45faf5-secret-volume\") pod \"collect-profiles-29328900-7p79z\" (UID: \"49846891-c3bb-4413-a9ba-1d58fb45faf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.322474 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlds2\" (UniqueName: \"kubernetes.io/projected/49846891-c3bb-4413-a9ba-1d58fb45faf5-kube-api-access-zlds2\") pod \"collect-profiles-29328900-7p79z\" (UID: \"49846891-c3bb-4413-a9ba-1d58fb45faf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z" Oct 06 07:00:00 crc kubenswrapper[4845]: I1006 07:00:00.454547 4845 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z" Oct 06 07:00:01 crc kubenswrapper[4845]: I1006 07:00:01.231193 4845 generic.go:334] "Generic (PLEG): container finished" podID="8d5452f2-c63d-4287-93c4-17b89651a7c1" containerID="c646108b662cee3dae7b57eb0a78ce79dd4cc903192ff502449206048d67230e" exitCode=0 Oct 06 07:00:01 crc kubenswrapper[4845]: I1006 07:00:01.231270 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8d5452f2-c63d-4287-93c4-17b89651a7c1","Type":"ContainerDied","Data":"c646108b662cee3dae7b57eb0a78ce79dd4cc903192ff502449206048d67230e"} Oct 06 07:00:01 crc kubenswrapper[4845]: I1006 07:00:01.926099 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z"] Oct 06 07:00:01 crc kubenswrapper[4845]: W1006 07:00:01.936472 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49846891_c3bb_4413_a9ba_1d58fb45faf5.slice/crio-5c7683dc03e6215ad0482739e9b0161491d47a086ab31b0cc9e781a0709e2aea WatchSource:0}: Error finding container 5c7683dc03e6215ad0482739e9b0161491d47a086ab31b0cc9e781a0709e2aea: Status 404 returned error can't find the container with id 5c7683dc03e6215ad0482739e9b0161491d47a086ab31b0cc9e781a0709e2aea Oct 06 07:00:02 crc kubenswrapper[4845]: I1006 07:00:02.238909 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6c48b55-c500-4427-a66e-ddaa189a08af" path="/var/lib/kubelet/pods/e6c48b55-c500-4427-a66e-ddaa189a08af/volumes" Oct 06 07:00:02 crc kubenswrapper[4845]: I1006 07:00:02.253043 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3dad007b-9982-4f85-842c-083964cd2734","Type":"ContainerStarted","Data":"001466b05775831ad573937ae09f854b4c3fd036d2feb16969202d8b0af02d65"} Oct 06 07:00:02 crc 
kubenswrapper[4845]: I1006 07:00:02.256126 4845 generic.go:334] "Generic (PLEG): container finished" podID="441b3c5d-0205-472b-8356-e10a4b5b3a4a" containerID="2e8eb11c3a6714703ca50e96ed9b2597b8e25eb366c5b6aeb4895de93ae7500b" exitCode=0 Oct 06 07:00:02 crc kubenswrapper[4845]: I1006 07:00:02.256248 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"441b3c5d-0205-472b-8356-e10a4b5b3a4a","Type":"ContainerDied","Data":"2e8eb11c3a6714703ca50e96ed9b2597b8e25eb366c5b6aeb4895de93ae7500b"} Oct 06 07:00:02 crc kubenswrapper[4845]: I1006 07:00:02.263281 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"952ffa29-f400-4b01-a4b7-282a401db753","Type":"ContainerStarted","Data":"b38da246b937229e3a9c1d30c0a3a6199dd660191d3308a0607dbfc611fd89e4"} Oct 06 07:00:02 crc kubenswrapper[4845]: I1006 07:00:02.267951 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8d5452f2-c63d-4287-93c4-17b89651a7c1","Type":"ContainerStarted","Data":"4f77852f8cec2f246ed32108487f5dd504ba0ab6c7c3566e2bc336c979b40e6c"} Oct 06 07:00:02 crc kubenswrapper[4845]: I1006 07:00:02.270508 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z" event={"ID":"49846891-c3bb-4413-a9ba-1d58fb45faf5","Type":"ContainerStarted","Data":"51ee1108f15883e99ac556879c87186836d445d0844cc2a8e8007d3a7a9aa4c6"} Oct 06 07:00:02 crc kubenswrapper[4845]: I1006 07:00:02.270555 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z" event={"ID":"49846891-c3bb-4413-a9ba-1d58fb45faf5","Type":"ContainerStarted","Data":"5c7683dc03e6215ad0482739e9b0161491d47a086ab31b0cc9e781a0709e2aea"} Oct 06 07:00:02 crc kubenswrapper[4845]: I1006 07:00:02.276315 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovsdbserver-nb-0" podStartSLOduration=19.838223704 podStartE2EDuration="29.276298702s" podCreationTimestamp="2025-10-06 06:59:33 +0000 UTC" firstStartedPulling="2025-10-06 06:59:52.051986774 +0000 UTC m=+876.566727812" lastFinishedPulling="2025-10-06 07:00:01.490061802 +0000 UTC m=+886.004802810" observedRunningTime="2025-10-06 07:00:02.274996249 +0000 UTC m=+886.789737267" watchObservedRunningTime="2025-10-06 07:00:02.276298702 +0000 UTC m=+886.791039710" Oct 06 07:00:02 crc kubenswrapper[4845]: I1006 07:00:02.301959 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.286214208 podStartE2EDuration="35.301940297s" podCreationTimestamp="2025-10-06 06:59:27 +0000 UTC" firstStartedPulling="2025-10-06 06:59:29.351627624 +0000 UTC m=+853.866368632" lastFinishedPulling="2025-10-06 06:59:55.367353713 +0000 UTC m=+879.882094721" observedRunningTime="2025-10-06 07:00:02.2976689 +0000 UTC m=+886.812409928" watchObservedRunningTime="2025-10-06 07:00:02.301940297 +0000 UTC m=+886.816681305" Oct 06 07:00:02 crc kubenswrapper[4845]: I1006 07:00:02.318060 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z" podStartSLOduration=2.318039112 podStartE2EDuration="2.318039112s" podCreationTimestamp="2025-10-06 07:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:00:02.313932779 +0000 UTC m=+886.828673787" watchObservedRunningTime="2025-10-06 07:00:02.318039112 +0000 UTC m=+886.832780110" Oct 06 07:00:02 crc kubenswrapper[4845]: I1006 07:00:02.339722 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=20.135046516 podStartE2EDuration="26.339702008s" podCreationTimestamp="2025-10-06 06:59:36 +0000 UTC" 
firstStartedPulling="2025-10-06 06:59:55.29254995 +0000 UTC m=+879.807290958" lastFinishedPulling="2025-10-06 07:00:01.497205432 +0000 UTC m=+886.011946450" observedRunningTime="2025-10-06 07:00:02.33385089 +0000 UTC m=+886.848591898" watchObservedRunningTime="2025-10-06 07:00:02.339702008 +0000 UTC m=+886.854443016" Oct 06 07:00:03 crc kubenswrapper[4845]: I1006 07:00:03.205537 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 06 07:00:03 crc kubenswrapper[4845]: I1006 07:00:03.310271 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"441b3c5d-0205-472b-8356-e10a4b5b3a4a","Type":"ContainerStarted","Data":"43d04b4d48a19eb036bbd74423902b0eb8f0c744f10f3ca77705e1b2789775e0"} Oct 06 07:00:03 crc kubenswrapper[4845]: I1006 07:00:03.314863 4845 generic.go:334] "Generic (PLEG): container finished" podID="49846891-c3bb-4413-a9ba-1d58fb45faf5" containerID="51ee1108f15883e99ac556879c87186836d445d0844cc2a8e8007d3a7a9aa4c6" exitCode=0 Oct 06 07:00:03 crc kubenswrapper[4845]: I1006 07:00:03.314968 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z" event={"ID":"49846891-c3bb-4413-a9ba-1d58fb45faf5","Type":"ContainerDied","Data":"51ee1108f15883e99ac556879c87186836d445d0844cc2a8e8007d3a7a9aa4c6"} Oct 06 07:00:03 crc kubenswrapper[4845]: I1006 07:00:03.339802 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=10.440969173 podStartE2EDuration="36.33978308s" podCreationTimestamp="2025-10-06 06:59:27 +0000 UTC" firstStartedPulling="2025-10-06 06:59:29.467425788 +0000 UTC m=+853.982166796" lastFinishedPulling="2025-10-06 06:59:55.366239695 +0000 UTC m=+879.880980703" observedRunningTime="2025-10-06 07:00:03.335536213 +0000 UTC m=+887.850277241" watchObservedRunningTime="2025-10-06 07:00:03.33978308 +0000 UTC 
m=+887.854524078" Oct 06 07:00:04 crc kubenswrapper[4845]: I1006 07:00:04.189533 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 06 07:00:04 crc kubenswrapper[4845]: I1006 07:00:04.587463 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z" Oct 06 07:00:04 crc kubenswrapper[4845]: I1006 07:00:04.594450 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" Oct 06 07:00:04 crc kubenswrapper[4845]: I1006 07:00:04.685888 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlds2\" (UniqueName: \"kubernetes.io/projected/49846891-c3bb-4413-a9ba-1d58fb45faf5-kube-api-access-zlds2\") pod \"49846891-c3bb-4413-a9ba-1d58fb45faf5\" (UID: \"49846891-c3bb-4413-a9ba-1d58fb45faf5\") " Oct 06 07:00:04 crc kubenswrapper[4845]: I1006 07:00:04.685995 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49846891-c3bb-4413-a9ba-1d58fb45faf5-config-volume\") pod \"49846891-c3bb-4413-a9ba-1d58fb45faf5\" (UID: \"49846891-c3bb-4413-a9ba-1d58fb45faf5\") " Oct 06 07:00:04 crc kubenswrapper[4845]: I1006 07:00:04.686114 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49846891-c3bb-4413-a9ba-1d58fb45faf5-secret-volume\") pod \"49846891-c3bb-4413-a9ba-1d58fb45faf5\" (UID: \"49846891-c3bb-4413-a9ba-1d58fb45faf5\") " Oct 06 07:00:04 crc kubenswrapper[4845]: I1006 07:00:04.686733 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49846891-c3bb-4413-a9ba-1d58fb45faf5-config-volume" (OuterVolumeSpecName: "config-volume") pod "49846891-c3bb-4413-a9ba-1d58fb45faf5" (UID: "49846891-c3bb-4413-a9ba-1d58fb45faf5"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:00:04 crc kubenswrapper[4845]: I1006 07:00:04.691850 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49846891-c3bb-4413-a9ba-1d58fb45faf5-kube-api-access-zlds2" (OuterVolumeSpecName: "kube-api-access-zlds2") pod "49846891-c3bb-4413-a9ba-1d58fb45faf5" (UID: "49846891-c3bb-4413-a9ba-1d58fb45faf5"). InnerVolumeSpecName "kube-api-access-zlds2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:00:04 crc kubenswrapper[4845]: I1006 07:00:04.692431 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49846891-c3bb-4413-a9ba-1d58fb45faf5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "49846891-c3bb-4413-a9ba-1d58fb45faf5" (UID: "49846891-c3bb-4413-a9ba-1d58fb45faf5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:00:04 crc kubenswrapper[4845]: I1006 07:00:04.788043 4845 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49846891-c3bb-4413-a9ba-1d58fb45faf5-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:04 crc kubenswrapper[4845]: I1006 07:00:04.788085 4845 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49846891-c3bb-4413-a9ba-1d58fb45faf5-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:04 crc kubenswrapper[4845]: I1006 07:00:04.788096 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlds2\" (UniqueName: \"kubernetes.io/projected/49846891-c3bb-4413-a9ba-1d58fb45faf5-kube-api-access-zlds2\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:04 crc kubenswrapper[4845]: I1006 07:00:04.887838 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 06 07:00:04 crc 
kubenswrapper[4845]: I1006 07:00:04.887980 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 06 07:00:04 crc kubenswrapper[4845]: I1006 07:00:04.927162 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" Oct 06 07:00:04 crc kubenswrapper[4845]: I1006 07:00:04.952135 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 06 07:00:04 crc kubenswrapper[4845]: I1006 07:00:04.987269 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689c78bb4c-mlcf4"] Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.205525 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.244696 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.353095 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" podUID="199df9d9-f617-44a5-8afe-9e5d086c249b" containerName="dnsmasq-dns" containerID="cri-o://c4b60a28088874146f65a177fd53f646a82847c972949561b39e1c2bdaba6bd2" gracePeriod=10 Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.353527 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z" event={"ID":"49846891-c3bb-4413-a9ba-1d58fb45faf5","Type":"ContainerDied","Data":"5c7683dc03e6215ad0482739e9b0161491d47a086ab31b0cc9e781a0709e2aea"} Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.353589 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c7683dc03e6215ad0482739e9b0161491d47a086ab31b0cc9e781a0709e2aea" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.353539 4845 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.452018 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.456886 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.619449 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d4dcb88c5-4wtgc"] Oct 06 07:00:05 crc kubenswrapper[4845]: E1006 07:00:05.619819 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49846891-c3bb-4413-a9ba-1d58fb45faf5" containerName="collect-profiles" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.619836 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="49846891-c3bb-4413-a9ba-1d58fb45faf5" containerName="collect-profiles" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.620050 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="49846891-c3bb-4413-a9ba-1d58fb45faf5" containerName="collect-profiles" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.620976 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d4dcb88c5-4wtgc" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.624695 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.641324 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d4dcb88c5-4wtgc"] Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.708923 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-4qbxx"] Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.710237 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-4qbxx" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.711121 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/789b27b6-350c-4372-9a9e-ac48734b7049-config\") pod \"dnsmasq-dns-6d4dcb88c5-4wtgc\" (UID: \"789b27b6-350c-4372-9a9e-ac48734b7049\") " pod="openstack/dnsmasq-dns-6d4dcb88c5-4wtgc" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.711181 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/789b27b6-350c-4372-9a9e-ac48734b7049-dns-svc\") pod \"dnsmasq-dns-6d4dcb88c5-4wtgc\" (UID: \"789b27b6-350c-4372-9a9e-ac48734b7049\") " pod="openstack/dnsmasq-dns-6d4dcb88c5-4wtgc" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.711257 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz5f2\" (UniqueName: \"kubernetes.io/projected/789b27b6-350c-4372-9a9e-ac48734b7049-kube-api-access-nz5f2\") pod \"dnsmasq-dns-6d4dcb88c5-4wtgc\" (UID: \"789b27b6-350c-4372-9a9e-ac48734b7049\") " pod="openstack/dnsmasq-dns-6d4dcb88c5-4wtgc" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 
07:00:05.711298 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/789b27b6-350c-4372-9a9e-ac48734b7049-ovsdbserver-sb\") pod \"dnsmasq-dns-6d4dcb88c5-4wtgc\" (UID: \"789b27b6-350c-4372-9a9e-ac48734b7049\") " pod="openstack/dnsmasq-dns-6d4dcb88c5-4wtgc" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.713663 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.732730 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4qbxx"] Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.795457 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d4dcb88c5-4wtgc"] Oct 06 07:00:05 crc kubenswrapper[4845]: E1006 07:00:05.796169 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-nz5f2 ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-6d4dcb88c5-4wtgc" podUID="789b27b6-350c-4372-9a9e-ac48734b7049" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.812436 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz5f2\" (UniqueName: \"kubernetes.io/projected/789b27b6-350c-4372-9a9e-ac48734b7049-kube-api-access-nz5f2\") pod \"dnsmasq-dns-6d4dcb88c5-4wtgc\" (UID: \"789b27b6-350c-4372-9a9e-ac48734b7049\") " pod="openstack/dnsmasq-dns-6d4dcb88c5-4wtgc" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.812508 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/789b27b6-350c-4372-9a9e-ac48734b7049-ovsdbserver-sb\") pod \"dnsmasq-dns-6d4dcb88c5-4wtgc\" (UID: \"789b27b6-350c-4372-9a9e-ac48734b7049\") " 
pod="openstack/dnsmasq-dns-6d4dcb88c5-4wtgc" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.812552 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jw4l\" (UniqueName: \"kubernetes.io/projected/175127a7-9d27-4976-a4bb-789072f8370c-kube-api-access-4jw4l\") pod \"ovn-controller-metrics-4qbxx\" (UID: \"175127a7-9d27-4976-a4bb-789072f8370c\") " pod="openstack/ovn-controller-metrics-4qbxx" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.812614 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/789b27b6-350c-4372-9a9e-ac48734b7049-config\") pod \"dnsmasq-dns-6d4dcb88c5-4wtgc\" (UID: \"789b27b6-350c-4372-9a9e-ac48734b7049\") " pod="openstack/dnsmasq-dns-6d4dcb88c5-4wtgc" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.812636 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/175127a7-9d27-4976-a4bb-789072f8370c-ovn-rundir\") pod \"ovn-controller-metrics-4qbxx\" (UID: \"175127a7-9d27-4976-a4bb-789072f8370c\") " pod="openstack/ovn-controller-metrics-4qbxx" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.812663 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175127a7-9d27-4976-a4bb-789072f8370c-config\") pod \"ovn-controller-metrics-4qbxx\" (UID: \"175127a7-9d27-4976-a4bb-789072f8370c\") " pod="openstack/ovn-controller-metrics-4qbxx" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.812688 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/175127a7-9d27-4976-a4bb-789072f8370c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4qbxx\" (UID: 
\"175127a7-9d27-4976-a4bb-789072f8370c\") " pod="openstack/ovn-controller-metrics-4qbxx" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.812723 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/789b27b6-350c-4372-9a9e-ac48734b7049-dns-svc\") pod \"dnsmasq-dns-6d4dcb88c5-4wtgc\" (UID: \"789b27b6-350c-4372-9a9e-ac48734b7049\") " pod="openstack/dnsmasq-dns-6d4dcb88c5-4wtgc" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.812748 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/175127a7-9d27-4976-a4bb-789072f8370c-ovs-rundir\") pod \"ovn-controller-metrics-4qbxx\" (UID: \"175127a7-9d27-4976-a4bb-789072f8370c\") " pod="openstack/ovn-controller-metrics-4qbxx" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.812794 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/175127a7-9d27-4976-a4bb-789072f8370c-combined-ca-bundle\") pod \"ovn-controller-metrics-4qbxx\" (UID: \"175127a7-9d27-4976-a4bb-789072f8370c\") " pod="openstack/ovn-controller-metrics-4qbxx" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.813641 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/789b27b6-350c-4372-9a9e-ac48734b7049-dns-svc\") pod \"dnsmasq-dns-6d4dcb88c5-4wtgc\" (UID: \"789b27b6-350c-4372-9a9e-ac48734b7049\") " pod="openstack/dnsmasq-dns-6d4dcb88c5-4wtgc" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.813824 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/789b27b6-350c-4372-9a9e-ac48734b7049-ovsdbserver-sb\") pod \"dnsmasq-dns-6d4dcb88c5-4wtgc\" (UID: \"789b27b6-350c-4372-9a9e-ac48734b7049\") " 
pod="openstack/dnsmasq-dns-6d4dcb88c5-4wtgc" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.816490 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/789b27b6-350c-4372-9a9e-ac48734b7049-config\") pod \"dnsmasq-dns-6d4dcb88c5-4wtgc\" (UID: \"789b27b6-350c-4372-9a9e-ac48734b7049\") " pod="openstack/dnsmasq-dns-6d4dcb88c5-4wtgc" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.841076 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz5f2\" (UniqueName: \"kubernetes.io/projected/789b27b6-350c-4372-9a9e-ac48734b7049-kube-api-access-nz5f2\") pod \"dnsmasq-dns-6d4dcb88c5-4wtgc\" (UID: \"789b27b6-350c-4372-9a9e-ac48734b7049\") " pod="openstack/dnsmasq-dns-6d4dcb88c5-4wtgc" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.842019 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.844195 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.850143 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.850945 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-gs8kf" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.851219 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.851498 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.855620 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b49547ff-knnjb"] Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.857005 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b49547ff-knnjb" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.858546 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.866500 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b49547ff-knnjb"] Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.880632 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.917157 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jw4l\" (UniqueName: \"kubernetes.io/projected/175127a7-9d27-4976-a4bb-789072f8370c-kube-api-access-4jw4l\") pod \"ovn-controller-metrics-4qbxx\" (UID: \"175127a7-9d27-4976-a4bb-789072f8370c\") " pod="openstack/ovn-controller-metrics-4qbxx" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.917206 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxww6\" (UniqueName: \"kubernetes.io/projected/bf5f7ffb-f69e-40fe-b8b7-157266325c88-kube-api-access-fxww6\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.917250 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5f7ffb-f69e-40fe-b8b7-157266325c88-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.917272 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/175127a7-9d27-4976-a4bb-789072f8370c-ovn-rundir\") pod 
\"ovn-controller-metrics-4qbxx\" (UID: \"175127a7-9d27-4976-a4bb-789072f8370c\") " pod="openstack/ovn-controller-metrics-4qbxx" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.917300 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175127a7-9d27-4976-a4bb-789072f8370c-config\") pod \"ovn-controller-metrics-4qbxx\" (UID: \"175127a7-9d27-4976-a4bb-789072f8370c\") " pod="openstack/ovn-controller-metrics-4qbxx" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.917320 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/175127a7-9d27-4976-a4bb-789072f8370c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4qbxx\" (UID: \"175127a7-9d27-4976-a4bb-789072f8370c\") " pod="openstack/ovn-controller-metrics-4qbxx" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.917343 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/175127a7-9d27-4976-a4bb-789072f8370c-ovs-rundir\") pod \"ovn-controller-metrics-4qbxx\" (UID: \"175127a7-9d27-4976-a4bb-789072f8370c\") " pod="openstack/ovn-controller-metrics-4qbxx" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.917368 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5f7ffb-f69e-40fe-b8b7-157266325c88-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.917407 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf5f7ffb-f69e-40fe-b8b7-157266325c88-scripts\") pod \"ovn-northd-0\" (UID: 
\"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.917424 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/175127a7-9d27-4976-a4bb-789072f8370c-combined-ca-bundle\") pod \"ovn-controller-metrics-4qbxx\" (UID: \"175127a7-9d27-4976-a4bb-789072f8370c\") " pod="openstack/ovn-controller-metrics-4qbxx" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.917439 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5f7ffb-f69e-40fe-b8b7-157266325c88-config\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.917472 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf5f7ffb-f69e-40fe-b8b7-157266325c88-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.917491 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5f7ffb-f69e-40fe-b8b7-157266325c88-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.921234 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/175127a7-9d27-4976-a4bb-789072f8370c-ovs-rundir\") pod \"ovn-controller-metrics-4qbxx\" (UID: \"175127a7-9d27-4976-a4bb-789072f8370c\") " pod="openstack/ovn-controller-metrics-4qbxx" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 
07:00:05.921696 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/175127a7-9d27-4976-a4bb-789072f8370c-ovn-rundir\") pod \"ovn-controller-metrics-4qbxx\" (UID: \"175127a7-9d27-4976-a4bb-789072f8370c\") " pod="openstack/ovn-controller-metrics-4qbxx" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.924608 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/175127a7-9d27-4976-a4bb-789072f8370c-combined-ca-bundle\") pod \"ovn-controller-metrics-4qbxx\" (UID: \"175127a7-9d27-4976-a4bb-789072f8370c\") " pod="openstack/ovn-controller-metrics-4qbxx" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.924958 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175127a7-9d27-4976-a4bb-789072f8370c-config\") pod \"ovn-controller-metrics-4qbxx\" (UID: \"175127a7-9d27-4976-a4bb-789072f8370c\") " pod="openstack/ovn-controller-metrics-4qbxx" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.925352 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/175127a7-9d27-4976-a4bb-789072f8370c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4qbxx\" (UID: \"175127a7-9d27-4976-a4bb-789072f8370c\") " pod="openstack/ovn-controller-metrics-4qbxx" Oct 06 07:00:05 crc kubenswrapper[4845]: I1006 07:00:05.952957 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jw4l\" (UniqueName: \"kubernetes.io/projected/175127a7-9d27-4976-a4bb-789072f8370c-kube-api-access-4jw4l\") pod \"ovn-controller-metrics-4qbxx\" (UID: \"175127a7-9d27-4976-a4bb-789072f8370c\") " pod="openstack/ovn-controller-metrics-4qbxx" Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.013725 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.018749 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5f7ffb-f69e-40fe-b8b7-157266325c88-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.018795 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-config\") pod \"dnsmasq-dns-67b49547ff-knnjb\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " pod="openstack/dnsmasq-dns-67b49547ff-knnjb"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.018843 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5f7ffb-f69e-40fe-b8b7-157266325c88-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.018869 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-ovsdbserver-sb\") pod \"dnsmasq-dns-67b49547ff-knnjb\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " pod="openstack/dnsmasq-dns-67b49547ff-knnjb"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.018890 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf5f7ffb-f69e-40fe-b8b7-157266325c88-scripts\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.018909 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5f7ffb-f69e-40fe-b8b7-157266325c88-config\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.018941 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf5f7ffb-f69e-40fe-b8b7-157266325c88-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.018965 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5f7ffb-f69e-40fe-b8b7-157266325c88-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.018989 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-dns-svc\") pod \"dnsmasq-dns-67b49547ff-knnjb\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " pod="openstack/dnsmasq-dns-67b49547ff-knnjb"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.019018 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-ovsdbserver-nb\") pod \"dnsmasq-dns-67b49547ff-knnjb\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " pod="openstack/dnsmasq-dns-67b49547ff-knnjb"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.019073 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxww6\" (UniqueName: \"kubernetes.io/projected/bf5f7ffb-f69e-40fe-b8b7-157266325c88-kube-api-access-fxww6\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.019096 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp78r\" (UniqueName: \"kubernetes.io/projected/0b2f8d23-5582-42f3-b050-caf5ad608dfd-kube-api-access-mp78r\") pod \"dnsmasq-dns-67b49547ff-knnjb\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " pod="openstack/dnsmasq-dns-67b49547ff-knnjb"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.020142 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5f7ffb-f69e-40fe-b8b7-157266325c88-config\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.020445 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf5f7ffb-f69e-40fe-b8b7-157266325c88-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.022106 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf5f7ffb-f69e-40fe-b8b7-157266325c88-scripts\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.024121 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5f7ffb-f69e-40fe-b8b7-157266325c88-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.024305 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5f7ffb-f69e-40fe-b8b7-157266325c88-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.027287 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5f7ffb-f69e-40fe-b8b7-157266325c88-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.036853 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-4qbxx"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.039208 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxww6\" (UniqueName: \"kubernetes.io/projected/bf5f7ffb-f69e-40fe-b8b7-157266325c88-kube-api-access-fxww6\") pod \"ovn-northd-0\" (UID: \"bf5f7ffb-f69e-40fe-b8b7-157266325c88\") " pod="openstack/ovn-northd-0"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.120147 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/199df9d9-f617-44a5-8afe-9e5d086c249b-dns-svc\") pod \"199df9d9-f617-44a5-8afe-9e5d086c249b\" (UID: \"199df9d9-f617-44a5-8afe-9e5d086c249b\") "
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.120183 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvqrr\" (UniqueName: \"kubernetes.io/projected/199df9d9-f617-44a5-8afe-9e5d086c249b-kube-api-access-rvqrr\") pod \"199df9d9-f617-44a5-8afe-9e5d086c249b\" (UID: \"199df9d9-f617-44a5-8afe-9e5d086c249b\") "
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.120243 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/199df9d9-f617-44a5-8afe-9e5d086c249b-config\") pod \"199df9d9-f617-44a5-8afe-9e5d086c249b\" (UID: \"199df9d9-f617-44a5-8afe-9e5d086c249b\") "
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.120537 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-ovsdbserver-sb\") pod \"dnsmasq-dns-67b49547ff-knnjb\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " pod="openstack/dnsmasq-dns-67b49547ff-knnjb"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.120612 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-dns-svc\") pod \"dnsmasq-dns-67b49547ff-knnjb\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " pod="openstack/dnsmasq-dns-67b49547ff-knnjb"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.120633 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-ovsdbserver-nb\") pod \"dnsmasq-dns-67b49547ff-knnjb\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " pod="openstack/dnsmasq-dns-67b49547ff-knnjb"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.120681 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp78r\" (UniqueName: \"kubernetes.io/projected/0b2f8d23-5582-42f3-b050-caf5ad608dfd-kube-api-access-mp78r\") pod \"dnsmasq-dns-67b49547ff-knnjb\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " pod="openstack/dnsmasq-dns-67b49547ff-knnjb"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.120724 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-config\") pod \"dnsmasq-dns-67b49547ff-knnjb\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " pod="openstack/dnsmasq-dns-67b49547ff-knnjb"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.121411 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-config\") pod \"dnsmasq-dns-67b49547ff-knnjb\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " pod="openstack/dnsmasq-dns-67b49547ff-knnjb"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.122189 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-dns-svc\") pod \"dnsmasq-dns-67b49547ff-knnjb\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " pod="openstack/dnsmasq-dns-67b49547ff-knnjb"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.122706 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-ovsdbserver-nb\") pod \"dnsmasq-dns-67b49547ff-knnjb\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " pod="openstack/dnsmasq-dns-67b49547ff-knnjb"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.126252 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-ovsdbserver-sb\") pod \"dnsmasq-dns-67b49547ff-knnjb\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " pod="openstack/dnsmasq-dns-67b49547ff-knnjb"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.131502 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/199df9d9-f617-44a5-8afe-9e5d086c249b-kube-api-access-rvqrr" (OuterVolumeSpecName: "kube-api-access-rvqrr") pod "199df9d9-f617-44a5-8afe-9e5d086c249b" (UID: "199df9d9-f617-44a5-8afe-9e5d086c249b"). InnerVolumeSpecName "kube-api-access-rvqrr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.139683 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp78r\" (UniqueName: \"kubernetes.io/projected/0b2f8d23-5582-42f3-b050-caf5ad608dfd-kube-api-access-mp78r\") pod \"dnsmasq-dns-67b49547ff-knnjb\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " pod="openstack/dnsmasq-dns-67b49547ff-knnjb"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.164635 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/199df9d9-f617-44a5-8afe-9e5d086c249b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "199df9d9-f617-44a5-8afe-9e5d086c249b" (UID: "199df9d9-f617-44a5-8afe-9e5d086c249b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.166432 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/199df9d9-f617-44a5-8afe-9e5d086c249b-config" (OuterVolumeSpecName: "config") pod "199df9d9-f617-44a5-8afe-9e5d086c249b" (UID: "199df9d9-f617-44a5-8afe-9e5d086c249b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.221984 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/199df9d9-f617-44a5-8afe-9e5d086c249b-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.222019 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvqrr\" (UniqueName: \"kubernetes.io/projected/199df9d9-f617-44a5-8afe-9e5d086c249b-kube-api-access-rvqrr\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.222031 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/199df9d9-f617-44a5-8afe-9e5d086c249b-config\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.299877 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.310501 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b49547ff-knnjb"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.365016 4845 generic.go:334] "Generic (PLEG): container finished" podID="199df9d9-f617-44a5-8afe-9e5d086c249b" containerID="c4b60a28088874146f65a177fd53f646a82847c972949561b39e1c2bdaba6bd2" exitCode=0
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.365126 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.365172 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" event={"ID":"199df9d9-f617-44a5-8afe-9e5d086c249b","Type":"ContainerDied","Data":"c4b60a28088874146f65a177fd53f646a82847c972949561b39e1c2bdaba6bd2"}
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.365217 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689c78bb4c-mlcf4" event={"ID":"199df9d9-f617-44a5-8afe-9e5d086c249b","Type":"ContainerDied","Data":"a1cd41b7ab2bcde44012e083380cebac8833554c79e7a853d96fcd9a5b989777"}
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.365236 4845 scope.go:117] "RemoveContainer" containerID="c4b60a28088874146f65a177fd53f646a82847c972949561b39e1c2bdaba6bd2"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.365388 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d4dcb88c5-4wtgc"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.382607 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d4dcb88c5-4wtgc"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.403339 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689c78bb4c-mlcf4"]
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.411555 4845 scope.go:117] "RemoveContainer" containerID="b4758d64fef3098f5ed37068f25299d3d54da4c7e15d729f88f20491ba6bfa58"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.411899 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-689c78bb4c-mlcf4"]
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.441123 4845 scope.go:117] "RemoveContainer" containerID="c4b60a28088874146f65a177fd53f646a82847c972949561b39e1c2bdaba6bd2"
Oct 06 07:00:06 crc kubenswrapper[4845]: E1006 07:00:06.444718 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4b60a28088874146f65a177fd53f646a82847c972949561b39e1c2bdaba6bd2\": container with ID starting with c4b60a28088874146f65a177fd53f646a82847c972949561b39e1c2bdaba6bd2 not found: ID does not exist" containerID="c4b60a28088874146f65a177fd53f646a82847c972949561b39e1c2bdaba6bd2"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.444760 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4b60a28088874146f65a177fd53f646a82847c972949561b39e1c2bdaba6bd2"} err="failed to get container status \"c4b60a28088874146f65a177fd53f646a82847c972949561b39e1c2bdaba6bd2\": rpc error: code = NotFound desc = could not find container \"c4b60a28088874146f65a177fd53f646a82847c972949561b39e1c2bdaba6bd2\": container with ID starting with c4b60a28088874146f65a177fd53f646a82847c972949561b39e1c2bdaba6bd2 not found: ID does not exist"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.444789 4845 scope.go:117] "RemoveContainer" containerID="b4758d64fef3098f5ed37068f25299d3d54da4c7e15d729f88f20491ba6bfa58"
Oct 06 07:00:06 crc kubenswrapper[4845]: E1006 07:00:06.447594 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4758d64fef3098f5ed37068f25299d3d54da4c7e15d729f88f20491ba6bfa58\": container with ID starting with b4758d64fef3098f5ed37068f25299d3d54da4c7e15d729f88f20491ba6bfa58 not found: ID does not exist" containerID="b4758d64fef3098f5ed37068f25299d3d54da4c7e15d729f88f20491ba6bfa58"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.447654 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4758d64fef3098f5ed37068f25299d3d54da4c7e15d729f88f20491ba6bfa58"} err="failed to get container status \"b4758d64fef3098f5ed37068f25299d3d54da4c7e15d729f88f20491ba6bfa58\": rpc error: code = NotFound desc = could not find container \"b4758d64fef3098f5ed37068f25299d3d54da4c7e15d729f88f20491ba6bfa58\": container with ID starting with b4758d64fef3098f5ed37068f25299d3d54da4c7e15d729f88f20491ba6bfa58 not found: ID does not exist"
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.495927 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4qbxx"]
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.528050 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/789b27b6-350c-4372-9a9e-ac48734b7049-ovsdbserver-sb\") pod \"789b27b6-350c-4372-9a9e-ac48734b7049\" (UID: \"789b27b6-350c-4372-9a9e-ac48734b7049\") "
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.529194 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/789b27b6-350c-4372-9a9e-ac48734b7049-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "789b27b6-350c-4372-9a9e-ac48734b7049" (UID: "789b27b6-350c-4372-9a9e-ac48734b7049"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.529655 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/789b27b6-350c-4372-9a9e-ac48734b7049-dns-svc\") pod \"789b27b6-350c-4372-9a9e-ac48734b7049\" (UID: \"789b27b6-350c-4372-9a9e-ac48734b7049\") "
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.529747 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/789b27b6-350c-4372-9a9e-ac48734b7049-config\") pod \"789b27b6-350c-4372-9a9e-ac48734b7049\" (UID: \"789b27b6-350c-4372-9a9e-ac48734b7049\") "
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.529803 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz5f2\" (UniqueName: \"kubernetes.io/projected/789b27b6-350c-4372-9a9e-ac48734b7049-kube-api-access-nz5f2\") pod \"789b27b6-350c-4372-9a9e-ac48734b7049\" (UID: \"789b27b6-350c-4372-9a9e-ac48734b7049\") "
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.530643 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/789b27b6-350c-4372-9a9e-ac48734b7049-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.532862 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/789b27b6-350c-4372-9a9e-ac48734b7049-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "789b27b6-350c-4372-9a9e-ac48734b7049" (UID: "789b27b6-350c-4372-9a9e-ac48734b7049"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.533037 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/789b27b6-350c-4372-9a9e-ac48734b7049-config" (OuterVolumeSpecName: "config") pod "789b27b6-350c-4372-9a9e-ac48734b7049" (UID: "789b27b6-350c-4372-9a9e-ac48734b7049"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.534369 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/789b27b6-350c-4372-9a9e-ac48734b7049-kube-api-access-nz5f2" (OuterVolumeSpecName: "kube-api-access-nz5f2") pod "789b27b6-350c-4372-9a9e-ac48734b7049" (UID: "789b27b6-350c-4372-9a9e-ac48734b7049"). InnerVolumeSpecName "kube-api-access-nz5f2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.633616 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/789b27b6-350c-4372-9a9e-ac48734b7049-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.633656 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/789b27b6-350c-4372-9a9e-ac48734b7049-config\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.633668 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz5f2\" (UniqueName: \"kubernetes.io/projected/789b27b6-350c-4372-9a9e-ac48734b7049-kube-api-access-nz5f2\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.776280 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Oct 06 07:00:06 crc kubenswrapper[4845]: W1006 07:00:06.779861 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf5f7ffb_f69e_40fe_b8b7_157266325c88.slice/crio-183137f05dc1f6b5caaccc4dde6c879a79fca1f1b2a482eb53880f36acb00d2a WatchSource:0}: Error finding container 183137f05dc1f6b5caaccc4dde6c879a79fca1f1b2a482eb53880f36acb00d2a: Status 404 returned error can't find the container with id 183137f05dc1f6b5caaccc4dde6c879a79fca1f1b2a482eb53880f36acb00d2a
Oct 06 07:00:06 crc kubenswrapper[4845]: I1006 07:00:06.838243 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b49547ff-knnjb"]
Oct 06 07:00:06 crc kubenswrapper[4845]: W1006 07:00:06.839726 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b2f8d23_5582_42f3_b050_caf5ad608dfd.slice/crio-1b10536f7a3c408d7416532b52cc2a36600c728aa45f885c78a49c7ad6c0d3d4 WatchSource:0}: Error finding container 1b10536f7a3c408d7416532b52cc2a36600c728aa45f885c78a49c7ad6c0d3d4: Status 404 returned error can't find the container with id 1b10536f7a3c408d7416532b52cc2a36600c728aa45f885c78a49c7ad6c0d3d4
Oct 06 07:00:07 crc kubenswrapper[4845]: I1006 07:00:07.377296 4845 generic.go:334] "Generic (PLEG): container finished" podID="0b2f8d23-5582-42f3-b050-caf5ad608dfd" containerID="abf8df35049529fb330e60e49c4d5127663aa42ce0c131f643d3094bcc0cbeac" exitCode=0
Oct 06 07:00:07 crc kubenswrapper[4845]: I1006 07:00:07.377411 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b49547ff-knnjb" event={"ID":"0b2f8d23-5582-42f3-b050-caf5ad608dfd","Type":"ContainerDied","Data":"abf8df35049529fb330e60e49c4d5127663aa42ce0c131f643d3094bcc0cbeac"}
Oct 06 07:00:07 crc kubenswrapper[4845]: I1006 07:00:07.377901 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b49547ff-knnjb" event={"ID":"0b2f8d23-5582-42f3-b050-caf5ad608dfd","Type":"ContainerStarted","Data":"1b10536f7a3c408d7416532b52cc2a36600c728aa45f885c78a49c7ad6c0d3d4"}
Oct 06 07:00:07 crc kubenswrapper[4845]: I1006 07:00:07.379534 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4qbxx" event={"ID":"175127a7-9d27-4976-a4bb-789072f8370c","Type":"ContainerStarted","Data":"f53aac9d9d6189d172832d4e411b06c80f4bb46da256239b03c9e984ec7bfda0"}
Oct 06 07:00:07 crc kubenswrapper[4845]: I1006 07:00:07.379565 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4qbxx" event={"ID":"175127a7-9d27-4976-a4bb-789072f8370c","Type":"ContainerStarted","Data":"12e8ecdd5686f6c8a724d024583b7787f7c5ea3d44cdb8e88d62b7e5571245af"}
Oct 06 07:00:07 crc kubenswrapper[4845]: I1006 07:00:07.380852 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38d9a5cf-6de3-487c-a71c-374ca55ca525","Type":"ContainerStarted","Data":"7822a084cc8e1391a4f88232f9ad984b3d121bb9054ea9593f43c5d0942dc39c"}
Oct 06 07:00:07 crc kubenswrapper[4845]: I1006 07:00:07.381774 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bf5f7ffb-f69e-40fe-b8b7-157266325c88","Type":"ContainerStarted","Data":"183137f05dc1f6b5caaccc4dde6c879a79fca1f1b2a482eb53880f36acb00d2a"}
Oct 06 07:00:07 crc kubenswrapper[4845]: I1006 07:00:07.382128 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d4dcb88c5-4wtgc"
Oct 06 07:00:07 crc kubenswrapper[4845]: I1006 07:00:07.450061 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-4qbxx" podStartSLOduration=2.450039196 podStartE2EDuration="2.450039196s" podCreationTimestamp="2025-10-06 07:00:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:00:07.436214038 +0000 UTC m=+891.950955076" watchObservedRunningTime="2025-10-06 07:00:07.450039196 +0000 UTC m=+891.964780224"
Oct 06 07:00:07 crc kubenswrapper[4845]: I1006 07:00:07.487434 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d4dcb88c5-4wtgc"]
Oct 06 07:00:07 crc kubenswrapper[4845]: I1006 07:00:07.492883 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d4dcb88c5-4wtgc"]
Oct 06 07:00:08 crc kubenswrapper[4845]: I1006 07:00:08.237298 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="199df9d9-f617-44a5-8afe-9e5d086c249b" path="/var/lib/kubelet/pods/199df9d9-f617-44a5-8afe-9e5d086c249b/volumes"
Oct 06 07:00:08 crc kubenswrapper[4845]: I1006 07:00:08.238266 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="789b27b6-350c-4372-9a9e-ac48734b7049" path="/var/lib/kubelet/pods/789b27b6-350c-4372-9a9e-ac48734b7049/volumes"
Oct 06 07:00:08 crc kubenswrapper[4845]: I1006 07:00:08.388510 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b49547ff-knnjb" event={"ID":"0b2f8d23-5582-42f3-b050-caf5ad608dfd","Type":"ContainerStarted","Data":"f6ad3dc98a1120a269ca30b7fe9907210db12e9f1d9664f17aeaddd9e82e75ef"}
Oct 06 07:00:08 crc kubenswrapper[4845]: I1006 07:00:08.388777 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b49547ff-knnjb"
Oct 06 07:00:08 crc kubenswrapper[4845]: I1006 07:00:08.390316 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bf5f7ffb-f69e-40fe-b8b7-157266325c88","Type":"ContainerStarted","Data":"42309898e8aa1a16fc4b2275640bbdffa900e4a7f890e6d18932e4c5cd14c215"}
Oct 06 07:00:08 crc kubenswrapper[4845]: I1006 07:00:08.390349 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bf5f7ffb-f69e-40fe-b8b7-157266325c88","Type":"ContainerStarted","Data":"4c345855997c21c101532ce2f14d55609482e50d263c19745f4ffe9c848d3c77"}
Oct 06 07:00:08 crc kubenswrapper[4845]: I1006 07:00:08.390422 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Oct 06 07:00:08 crc kubenswrapper[4845]: I1006 07:00:08.424904 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b49547ff-knnjb" podStartSLOduration=3.424886633 podStartE2EDuration="3.424886633s" podCreationTimestamp="2025-10-06 07:00:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:00:08.406702646 +0000 UTC m=+892.921443664" watchObservedRunningTime="2025-10-06 07:00:08.424886633 +0000 UTC m=+892.939627641"
Oct 06 07:00:08 crc kubenswrapper[4845]: I1006 07:00:08.428482 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.723164891 podStartE2EDuration="3.428465734s" podCreationTimestamp="2025-10-06 07:00:05 +0000 UTC" firstStartedPulling="2025-10-06 07:00:06.781996811 +0000 UTC m=+891.296737819" lastFinishedPulling="2025-10-06 07:00:07.487297654 +0000 UTC m=+892.002038662" observedRunningTime="2025-10-06 07:00:08.42077649 +0000 UTC m=+892.935517518" watchObservedRunningTime="2025-10-06 07:00:08.428465734 +0000 UTC m=+892.943206742"
Oct 06 07:00:08 crc kubenswrapper[4845]: I1006 07:00:08.629972 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Oct 06 07:00:08 crc kubenswrapper[4845]: I1006 07:00:08.630024 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Oct 06 07:00:08 crc kubenswrapper[4845]: I1006 07:00:08.682002 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Oct 06 07:00:08 crc kubenswrapper[4845]: I1006 07:00:08.750130 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Oct 06 07:00:08 crc kubenswrapper[4845]: I1006 07:00:08.750212 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Oct 06 07:00:08 crc kubenswrapper[4845]: I1006 07:00:08.861972 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Oct 06 07:00:09 crc kubenswrapper[4845]: I1006 07:00:09.437603 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Oct 06 07:00:09 crc kubenswrapper[4845]: I1006 07:00:09.438458 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Oct 06 07:00:10 crc kubenswrapper[4845]: I1006 07:00:10.981643 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.033564 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b49547ff-knnjb"]
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.033768 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67b49547ff-knnjb" podUID="0b2f8d23-5582-42f3-b050-caf5ad608dfd" containerName="dnsmasq-dns" containerID="cri-o://f6ad3dc98a1120a269ca30b7fe9907210db12e9f1d9664f17aeaddd9e82e75ef" gracePeriod=10
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.065142 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c7856c787-p69z9"]
Oct 06 07:00:11 crc kubenswrapper[4845]: E1006 07:00:11.065473 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="199df9d9-f617-44a5-8afe-9e5d086c249b" containerName="dnsmasq-dns"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.065488 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="199df9d9-f617-44a5-8afe-9e5d086c249b" containerName="dnsmasq-dns"
Oct 06 07:00:11 crc kubenswrapper[4845]: E1006 07:00:11.065500 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="199df9d9-f617-44a5-8afe-9e5d086c249b" containerName="init"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.065506 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="199df9d9-f617-44a5-8afe-9e5d086c249b" containerName="init"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.065674 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="199df9d9-f617-44a5-8afe-9e5d086c249b" containerName="dnsmasq-dns"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.066438 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c7856c787-p69z9"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.088144 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c7856c787-p69z9"]
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.113249 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-ovsdbserver-nb\") pod \"dnsmasq-dns-c7856c787-p69z9\" (UID: \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " pod="openstack/dnsmasq-dns-c7856c787-p69z9"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.113328 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dx4q\" (UniqueName: \"kubernetes.io/projected/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-kube-api-access-9dx4q\") pod \"dnsmasq-dns-c7856c787-p69z9\" (UID: \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " pod="openstack/dnsmasq-dns-c7856c787-p69z9"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.113462 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-dns-svc\") pod \"dnsmasq-dns-c7856c787-p69z9\" (UID: \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " pod="openstack/dnsmasq-dns-c7856c787-p69z9"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.113560 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-config\") pod \"dnsmasq-dns-c7856c787-p69z9\" (UID: \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " pod="openstack/dnsmasq-dns-c7856c787-p69z9"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.113724 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-ovsdbserver-sb\") pod \"dnsmasq-dns-c7856c787-p69z9\" (UID: \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " pod="openstack/dnsmasq-dns-c7856c787-p69z9"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.215221 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-config\") pod \"dnsmasq-dns-c7856c787-p69z9\" (UID: \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " pod="openstack/dnsmasq-dns-c7856c787-p69z9"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.215602 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-ovsdbserver-sb\") pod \"dnsmasq-dns-c7856c787-p69z9\" (UID: \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " pod="openstack/dnsmasq-dns-c7856c787-p69z9"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.215651 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-ovsdbserver-nb\") pod \"dnsmasq-dns-c7856c787-p69z9\" (UID: \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " pod="openstack/dnsmasq-dns-c7856c787-p69z9"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.215707 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dx4q\" (UniqueName: \"kubernetes.io/projected/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-kube-api-access-9dx4q\") pod \"dnsmasq-dns-c7856c787-p69z9\" (UID: \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " pod="openstack/dnsmasq-dns-c7856c787-p69z9"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.215733 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-dns-svc\") pod \"dnsmasq-dns-c7856c787-p69z9\" (UID: \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " pod="openstack/dnsmasq-dns-c7856c787-p69z9"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.216206 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-config\") pod \"dnsmasq-dns-c7856c787-p69z9\" (UID: \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " pod="openstack/dnsmasq-dns-c7856c787-p69z9"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.216456 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-ovsdbserver-sb\") pod \"dnsmasq-dns-c7856c787-p69z9\" (UID: \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " pod="openstack/dnsmasq-dns-c7856c787-p69z9"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.216592 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-ovsdbserver-nb\") pod \"dnsmasq-dns-c7856c787-p69z9\" (UID: \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " pod="openstack/dnsmasq-dns-c7856c787-p69z9"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.217596 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-dns-svc\") pod \"dnsmasq-dns-c7856c787-p69z9\" (UID: \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " pod="openstack/dnsmasq-dns-c7856c787-p69z9"
Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.235024 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dx4q\" (UniqueName: \"kubernetes.io/projected/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-kube-api-access-9dx4q\") pod \"dnsmasq-dns-c7856c787-p69z9\" (UID:
\"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " pod="openstack/dnsmasq-dns-c7856c787-p69z9" Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.411231 4845 generic.go:334] "Generic (PLEG): container finished" podID="0b2f8d23-5582-42f3-b050-caf5ad608dfd" containerID="f6ad3dc98a1120a269ca30b7fe9907210db12e9f1d9664f17aeaddd9e82e75ef" exitCode=0 Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.411327 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b49547ff-knnjb" event={"ID":"0b2f8d23-5582-42f3-b050-caf5ad608dfd","Type":"ContainerDied","Data":"f6ad3dc98a1120a269ca30b7fe9907210db12e9f1d9664f17aeaddd9e82e75ef"} Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.411542 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b49547ff-knnjb" event={"ID":"0b2f8d23-5582-42f3-b050-caf5ad608dfd","Type":"ContainerDied","Data":"1b10536f7a3c408d7416532b52cc2a36600c728aa45f885c78a49c7ad6c0d3d4"} Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.411611 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b10536f7a3c408d7416532b52cc2a36600c728aa45f885c78a49c7ad6c0d3d4" Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.431257 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b49547ff-knnjb" Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.440409 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c7856c787-p69z9" Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.522821 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-ovsdbserver-nb\") pod \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.522886 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp78r\" (UniqueName: \"kubernetes.io/projected/0b2f8d23-5582-42f3-b050-caf5ad608dfd-kube-api-access-mp78r\") pod \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.522932 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-ovsdbserver-sb\") pod \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.523082 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-config\") pod \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.523151 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-dns-svc\") pod \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\" (UID: \"0b2f8d23-5582-42f3-b050-caf5ad608dfd\") " Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.528275 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0b2f8d23-5582-42f3-b050-caf5ad608dfd-kube-api-access-mp78r" (OuterVolumeSpecName: "kube-api-access-mp78r") pod "0b2f8d23-5582-42f3-b050-caf5ad608dfd" (UID: "0b2f8d23-5582-42f3-b050-caf5ad608dfd"). InnerVolumeSpecName "kube-api-access-mp78r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.558434 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0b2f8d23-5582-42f3-b050-caf5ad608dfd" (UID: "0b2f8d23-5582-42f3-b050-caf5ad608dfd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.560976 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0b2f8d23-5582-42f3-b050-caf5ad608dfd" (UID: "0b2f8d23-5582-42f3-b050-caf5ad608dfd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.568720 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-config" (OuterVolumeSpecName: "config") pod "0b2f8d23-5582-42f3-b050-caf5ad608dfd" (UID: "0b2f8d23-5582-42f3-b050-caf5ad608dfd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.573038 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0b2f8d23-5582-42f3-b050-caf5ad608dfd" (UID: "0b2f8d23-5582-42f3-b050-caf5ad608dfd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.625456 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.625484 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.625495 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp78r\" (UniqueName: \"kubernetes.io/projected/0b2f8d23-5582-42f3-b050-caf5ad608dfd-kube-api-access-mp78r\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.625504 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.625512 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b2f8d23-5582-42f3-b050-caf5ad608dfd-config\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:11 crc kubenswrapper[4845]: I1006 07:00:11.885176 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c7856c787-p69z9"] Oct 06 07:00:11 crc kubenswrapper[4845]: W1006 07:00:11.896915 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fb1d4cd_3f24_42a8_87d3_888654b8ac01.slice/crio-45b5820ae10fd18064c5dbbcd87f3ede0f63ff51ddb08af21c20638538be6b94 WatchSource:0}: Error finding container 45b5820ae10fd18064c5dbbcd87f3ede0f63ff51ddb08af21c20638538be6b94: Status 404 returned error can't find 
the container with id 45b5820ae10fd18064c5dbbcd87f3ede0f63ff51ddb08af21c20638538be6b94 Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.191341 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 06 07:00:12 crc kubenswrapper[4845]: E1006 07:00:12.191938 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2f8d23-5582-42f3-b050-caf5ad608dfd" containerName="dnsmasq-dns" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.191953 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2f8d23-5582-42f3-b050-caf5ad608dfd" containerName="dnsmasq-dns" Oct 06 07:00:12 crc kubenswrapper[4845]: E1006 07:00:12.191979 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2f8d23-5582-42f3-b050-caf5ad608dfd" containerName="init" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.191985 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2f8d23-5582-42f3-b050-caf5ad608dfd" containerName="init" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.192257 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b2f8d23-5582-42f3-b050-caf5ad608dfd" containerName="dnsmasq-dns" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.199031 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.200818 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-6gwmg" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.201085 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.202410 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.202581 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.241022 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.346042 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ede02a6f-9a89-4a1d-960d-10490334fbd7-lock\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.346104 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ede02a6f-9a89-4a1d-960d-10490334fbd7-cache\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.346190 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0" Oct 06 07:00:12 crc kubenswrapper[4845]: 
I1006 07:00:12.346223 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-etc-swift\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.346246 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k447\" (UniqueName: \"kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-kube-api-access-2k447\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.419600 4845 generic.go:334] "Generic (PLEG): container finished" podID="4fb1d4cd-3f24-42a8-87d3-888654b8ac01" containerID="a3568b386c28d0fea94b185fc942cf040c3a2c15a8bdf5ac3d67e2c39d10cc5e" exitCode=0 Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.419651 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c7856c787-p69z9" event={"ID":"4fb1d4cd-3f24-42a8-87d3-888654b8ac01","Type":"ContainerDied","Data":"a3568b386c28d0fea94b185fc942cf040c3a2c15a8bdf5ac3d67e2c39d10cc5e"} Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.419680 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c7856c787-p69z9" event={"ID":"4fb1d4cd-3f24-42a8-87d3-888654b8ac01","Type":"ContainerStarted","Data":"45b5820ae10fd18064c5dbbcd87f3ede0f63ff51ddb08af21c20638538be6b94"} Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.419707 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b49547ff-knnjb" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.447949 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ede02a6f-9a89-4a1d-960d-10490334fbd7-lock\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.448029 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ede02a6f-9a89-4a1d-960d-10490334fbd7-cache\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.448142 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.448176 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-etc-swift\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.448215 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k447\" (UniqueName: \"kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-kube-api-access-2k447\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0" Oct 06 07:00:12 crc kubenswrapper[4845]: E1006 07:00:12.448321 4845 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not 
found Oct 06 07:00:12 crc kubenswrapper[4845]: E1006 07:00:12.448341 4845 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 07:00:12 crc kubenswrapper[4845]: E1006 07:00:12.448406 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-etc-swift podName:ede02a6f-9a89-4a1d-960d-10490334fbd7 nodeName:}" failed. No retries permitted until 2025-10-06 07:00:12.948390706 +0000 UTC m=+897.463131714 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-etc-swift") pod "swift-storage-0" (UID: "ede02a6f-9a89-4a1d-960d-10490334fbd7") : configmap "swift-ring-files" not found Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.448504 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ede02a6f-9a89-4a1d-960d-10490334fbd7-lock\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.448726 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/swift-storage-0" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.448856 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ede02a6f-9a89-4a1d-960d-10490334fbd7-cache\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.465231 4845 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-67b49547ff-knnjb"] Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.472469 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k447\" (UniqueName: \"kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-kube-api-access-2k447\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.473410 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b49547ff-knnjb"] Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.482296 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.741546 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-5lnrx"] Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.742515 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-5lnrx" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.744303 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.744539 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.745251 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.780536 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-5lnrx"] Oct 06 07:00:12 crc kubenswrapper[4845]: E1006 07:00:12.781207 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-gwg7v ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-5lnrx" podUID="ea50f878-3787-4fe1-adb0-ba16684aec21" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.788947 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-86cb9"] Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.792222 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-86cb9" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.797109 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-5lnrx"] Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.821752 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-86cb9"] Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.853734 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwg7v\" (UniqueName: \"kubernetes.io/projected/ea50f878-3787-4fe1-adb0-ba16684aec21-kube-api-access-gwg7v\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " pod="openstack/swift-ring-rebalance-5lnrx" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.853799 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea50f878-3787-4fe1-adb0-ba16684aec21-swiftconf\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " pod="openstack/swift-ring-rebalance-5lnrx" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.853828 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ea50f878-3787-4fe1-adb0-ba16684aec21-ring-data-devices\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " pod="openstack/swift-ring-rebalance-5lnrx" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.853861 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea50f878-3787-4fe1-adb0-ba16684aec21-combined-ca-bundle\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " 
pod="openstack/swift-ring-rebalance-5lnrx" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.853979 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea50f878-3787-4fe1-adb0-ba16684aec21-etc-swift\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " pod="openstack/swift-ring-rebalance-5lnrx" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.854097 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea50f878-3787-4fe1-adb0-ba16684aec21-scripts\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " pod="openstack/swift-ring-rebalance-5lnrx" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.854179 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea50f878-3787-4fe1-adb0-ba16684aec21-dispersionconf\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " pod="openstack/swift-ring-rebalance-5lnrx" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.956543 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/955c6fb6-c2a5-48cd-8680-632f32157e5c-ring-data-devices\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.956622 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea50f878-3787-4fe1-adb0-ba16684aec21-scripts\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " 
pod="openstack/swift-ring-rebalance-5lnrx" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.956664 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/955c6fb6-c2a5-48cd-8680-632f32157e5c-swiftconf\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.956700 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea50f878-3787-4fe1-adb0-ba16684aec21-dispersionconf\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " pod="openstack/swift-ring-rebalance-5lnrx" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.956739 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/955c6fb6-c2a5-48cd-8680-632f32157e5c-dispersionconf\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.956796 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/955c6fb6-c2a5-48cd-8680-632f32157e5c-etc-swift\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9" Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.956836 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-etc-swift\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0" Oct 06 07:00:12 crc 
kubenswrapper[4845]: I1006 07:00:12.956860 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwg7v\" (UniqueName: \"kubernetes.io/projected/ea50f878-3787-4fe1-adb0-ba16684aec21-kube-api-access-gwg7v\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " pod="openstack/swift-ring-rebalance-5lnrx"
Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.956905 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/955c6fb6-c2a5-48cd-8680-632f32157e5c-scripts\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.956947 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955c6fb6-c2a5-48cd-8680-632f32157e5c-combined-ca-bundle\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:12 crc kubenswrapper[4845]: E1006 07:00:12.956973 4845 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 06 07:00:12 crc kubenswrapper[4845]: E1006 07:00:12.956992 4845 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 06 07:00:12 crc kubenswrapper[4845]: E1006 07:00:12.957036 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-etc-swift podName:ede02a6f-9a89-4a1d-960d-10490334fbd7 nodeName:}" failed. No retries permitted until 2025-10-06 07:00:13.957019308 +0000 UTC m=+898.471760426 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-etc-swift") pod "swift-storage-0" (UID: "ede02a6f-9a89-4a1d-960d-10490334fbd7") : configmap "swift-ring-files" not found
Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.957053 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea50f878-3787-4fe1-adb0-ba16684aec21-swiftconf\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " pod="openstack/swift-ring-rebalance-5lnrx"
Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.957083 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rptd9\" (UniqueName: \"kubernetes.io/projected/955c6fb6-c2a5-48cd-8680-632f32157e5c-kube-api-access-rptd9\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.957130 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ea50f878-3787-4fe1-adb0-ba16684aec21-ring-data-devices\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " pod="openstack/swift-ring-rebalance-5lnrx"
Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.957188 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea50f878-3787-4fe1-adb0-ba16684aec21-combined-ca-bundle\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " pod="openstack/swift-ring-rebalance-5lnrx"
Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.957220 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea50f878-3787-4fe1-adb0-ba16684aec21-etc-swift\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " pod="openstack/swift-ring-rebalance-5lnrx"
Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.957616 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea50f878-3787-4fe1-adb0-ba16684aec21-scripts\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " pod="openstack/swift-ring-rebalance-5lnrx"
Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.957668 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea50f878-3787-4fe1-adb0-ba16684aec21-etc-swift\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " pod="openstack/swift-ring-rebalance-5lnrx"
Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.957890 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ea50f878-3787-4fe1-adb0-ba16684aec21-ring-data-devices\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " pod="openstack/swift-ring-rebalance-5lnrx"
Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.961596 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea50f878-3787-4fe1-adb0-ba16684aec21-swiftconf\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " pod="openstack/swift-ring-rebalance-5lnrx"
Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.966292 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea50f878-3787-4fe1-adb0-ba16684aec21-dispersionconf\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " pod="openstack/swift-ring-rebalance-5lnrx"
Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.967681 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea50f878-3787-4fe1-adb0-ba16684aec21-combined-ca-bundle\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " pod="openstack/swift-ring-rebalance-5lnrx"
Oct 06 07:00:12 crc kubenswrapper[4845]: I1006 07:00:12.975413 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwg7v\" (UniqueName: \"kubernetes.io/projected/ea50f878-3787-4fe1-adb0-ba16684aec21-kube-api-access-gwg7v\") pod \"swift-ring-rebalance-5lnrx\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") " pod="openstack/swift-ring-rebalance-5lnrx"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.058357 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rptd9\" (UniqueName: \"kubernetes.io/projected/955c6fb6-c2a5-48cd-8680-632f32157e5c-kube-api-access-rptd9\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.058476 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/955c6fb6-c2a5-48cd-8680-632f32157e5c-ring-data-devices\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.058514 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/955c6fb6-c2a5-48cd-8680-632f32157e5c-swiftconf\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.058552 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/955c6fb6-c2a5-48cd-8680-632f32157e5c-dispersionconf\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.058577 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/955c6fb6-c2a5-48cd-8680-632f32157e5c-etc-swift\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.058619 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/955c6fb6-c2a5-48cd-8680-632f32157e5c-scripts\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.058639 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955c6fb6-c2a5-48cd-8680-632f32157e5c-combined-ca-bundle\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.058984 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/955c6fb6-c2a5-48cd-8680-632f32157e5c-etc-swift\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.059220 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/955c6fb6-c2a5-48cd-8680-632f32157e5c-ring-data-devices\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.059703 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/955c6fb6-c2a5-48cd-8680-632f32157e5c-scripts\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.061621 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/955c6fb6-c2a5-48cd-8680-632f32157e5c-swiftconf\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.061671 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/955c6fb6-c2a5-48cd-8680-632f32157e5c-dispersionconf\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.062629 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955c6fb6-c2a5-48cd-8680-632f32157e5c-combined-ca-bundle\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.074400 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rptd9\" (UniqueName: \"kubernetes.io/projected/955c6fb6-c2a5-48cd-8680-632f32157e5c-kube-api-access-rptd9\") pod \"swift-ring-rebalance-86cb9\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") " pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.121515 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.429246 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5lnrx"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.429462 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c7856c787-p69z9" event={"ID":"4fb1d4cd-3f24-42a8-87d3-888654b8ac01","Type":"ContainerStarted","Data":"0680f345cdd85c5410198edfad14ddbfaf6ea7adc947019e44d90dad2e7213c0"}
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.429887 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c7856c787-p69z9"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.439360 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5lnrx"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.454494 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c7856c787-p69z9" podStartSLOduration=2.454472749 podStartE2EDuration="2.454472749s" podCreationTimestamp="2025-10-06 07:00:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:00:13.44972914 +0000 UTC m=+897.964470158" watchObservedRunningTime="2025-10-06 07:00:13.454472749 +0000 UTC m=+897.969213767"
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.565922 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-86cb9"]
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.572072 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwg7v\" (UniqueName: \"kubernetes.io/projected/ea50f878-3787-4fe1-adb0-ba16684aec21-kube-api-access-gwg7v\") pod \"ea50f878-3787-4fe1-adb0-ba16684aec21\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") "
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.572136 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea50f878-3787-4fe1-adb0-ba16684aec21-swiftconf\") pod \"ea50f878-3787-4fe1-adb0-ba16684aec21\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") "
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.572222 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea50f878-3787-4fe1-adb0-ba16684aec21-combined-ca-bundle\") pod \"ea50f878-3787-4fe1-adb0-ba16684aec21\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") "
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.572353 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea50f878-3787-4fe1-adb0-ba16684aec21-etc-swift\") pod \"ea50f878-3787-4fe1-adb0-ba16684aec21\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") "
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.572400 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea50f878-3787-4fe1-adb0-ba16684aec21-scripts\") pod \"ea50f878-3787-4fe1-adb0-ba16684aec21\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") "
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.572454 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea50f878-3787-4fe1-adb0-ba16684aec21-dispersionconf\") pod \"ea50f878-3787-4fe1-adb0-ba16684aec21\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") "
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.572476 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ea50f878-3787-4fe1-adb0-ba16684aec21-ring-data-devices\") pod \"ea50f878-3787-4fe1-adb0-ba16684aec21\" (UID: \"ea50f878-3787-4fe1-adb0-ba16684aec21\") "
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.572735 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea50f878-3787-4fe1-adb0-ba16684aec21-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ea50f878-3787-4fe1-adb0-ba16684aec21" (UID: "ea50f878-3787-4fe1-adb0-ba16684aec21"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.573208 4845 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea50f878-3787-4fe1-adb0-ba16684aec21-etc-swift\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.573724 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea50f878-3787-4fe1-adb0-ba16684aec21-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ea50f878-3787-4fe1-adb0-ba16684aec21" (UID: "ea50f878-3787-4fe1-adb0-ba16684aec21"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.573859 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea50f878-3787-4fe1-adb0-ba16684aec21-scripts" (OuterVolumeSpecName: "scripts") pod "ea50f878-3787-4fe1-adb0-ba16684aec21" (UID: "ea50f878-3787-4fe1-adb0-ba16684aec21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.577237 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea50f878-3787-4fe1-adb0-ba16684aec21-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ea50f878-3787-4fe1-adb0-ba16684aec21" (UID: "ea50f878-3787-4fe1-adb0-ba16684aec21"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.577258 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea50f878-3787-4fe1-adb0-ba16684aec21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea50f878-3787-4fe1-adb0-ba16684aec21" (UID: "ea50f878-3787-4fe1-adb0-ba16684aec21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.577577 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea50f878-3787-4fe1-adb0-ba16684aec21-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ea50f878-3787-4fe1-adb0-ba16684aec21" (UID: "ea50f878-3787-4fe1-adb0-ba16684aec21"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:00:13 crc kubenswrapper[4845]: W1006 07:00:13.578748 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod955c6fb6_c2a5_48cd_8680_632f32157e5c.slice/crio-ecb4a2fd2a873675e80f9ada6b31f9908ac2c04c2b81b7655ea86dc6706b09aa WatchSource:0}: Error finding container ecb4a2fd2a873675e80f9ada6b31f9908ac2c04c2b81b7655ea86dc6706b09aa: Status 404 returned error can't find the container with id ecb4a2fd2a873675e80f9ada6b31f9908ac2c04c2b81b7655ea86dc6706b09aa
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.579752 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea50f878-3787-4fe1-adb0-ba16684aec21-kube-api-access-gwg7v" (OuterVolumeSpecName: "kube-api-access-gwg7v") pod "ea50f878-3787-4fe1-adb0-ba16684aec21" (UID: "ea50f878-3787-4fe1-adb0-ba16684aec21"). InnerVolumeSpecName "kube-api-access-gwg7v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.675692 4845 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ea50f878-3787-4fe1-adb0-ba16684aec21-ring-data-devices\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.675763 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwg7v\" (UniqueName: \"kubernetes.io/projected/ea50f878-3787-4fe1-adb0-ba16684aec21-kube-api-access-gwg7v\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.675795 4845 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea50f878-3787-4fe1-adb0-ba16684aec21-swiftconf\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.675820 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea50f878-3787-4fe1-adb0-ba16684aec21-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.675848 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea50f878-3787-4fe1-adb0-ba16684aec21-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.675873 4845 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea50f878-3787-4fe1-adb0-ba16684aec21-dispersionconf\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:13 crc kubenswrapper[4845]: I1006 07:00:13.980171 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-etc-swift\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0"
Oct 06 07:00:13 crc kubenswrapper[4845]: E1006 07:00:13.980341 4845 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 06 07:00:13 crc kubenswrapper[4845]: E1006 07:00:13.980391 4845 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 06 07:00:13 crc kubenswrapper[4845]: E1006 07:00:13.980454 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-etc-swift podName:ede02a6f-9a89-4a1d-960d-10490334fbd7 nodeName:}" failed. No retries permitted until 2025-10-06 07:00:15.980436137 +0000 UTC m=+900.495177145 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-etc-swift") pod "swift-storage-0" (UID: "ede02a6f-9a89-4a1d-960d-10490334fbd7") : configmap "swift-ring-files" not found
Oct 06 07:00:14 crc kubenswrapper[4845]: I1006 07:00:14.245241 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b2f8d23-5582-42f3-b050-caf5ad608dfd" path="/var/lib/kubelet/pods/0b2f8d23-5582-42f3-b050-caf5ad608dfd/volumes"
Oct 06 07:00:14 crc kubenswrapper[4845]: I1006 07:00:14.442224 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-ksn8k"]
Oct 06 07:00:14 crc kubenswrapper[4845]: I1006 07:00:14.443451 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ksn8k"
Oct 06 07:00:14 crc kubenswrapper[4845]: I1006 07:00:14.448558 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ksn8k"]
Oct 06 07:00:14 crc kubenswrapper[4845]: I1006 07:00:14.450406 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-86cb9" event={"ID":"955c6fb6-c2a5-48cd-8680-632f32157e5c","Type":"ContainerStarted","Data":"ecb4a2fd2a873675e80f9ada6b31f9908ac2c04c2b81b7655ea86dc6706b09aa"}
Oct 06 07:00:14 crc kubenswrapper[4845]: I1006 07:00:14.450434 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5lnrx"
Oct 06 07:00:14 crc kubenswrapper[4845]: I1006 07:00:14.495236 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-5lnrx"]
Oct 06 07:00:14 crc kubenswrapper[4845]: I1006 07:00:14.496070 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-5lnrx"]
Oct 06 07:00:14 crc kubenswrapper[4845]: I1006 07:00:14.497826 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4bsg\" (UniqueName: \"kubernetes.io/projected/31afef42-8484-46bf-ab3e-be1b8eb4dd4a-kube-api-access-q4bsg\") pod \"glance-db-create-ksn8k\" (UID: \"31afef42-8484-46bf-ab3e-be1b8eb4dd4a\") " pod="openstack/glance-db-create-ksn8k"
Oct 06 07:00:14 crc kubenswrapper[4845]: I1006 07:00:14.599418 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4bsg\" (UniqueName: \"kubernetes.io/projected/31afef42-8484-46bf-ab3e-be1b8eb4dd4a-kube-api-access-q4bsg\") pod \"glance-db-create-ksn8k\" (UID: \"31afef42-8484-46bf-ab3e-be1b8eb4dd4a\") " pod="openstack/glance-db-create-ksn8k"
Oct 06 07:00:14 crc kubenswrapper[4845]: I1006 07:00:14.618505 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4bsg\" (UniqueName: \"kubernetes.io/projected/31afef42-8484-46bf-ab3e-be1b8eb4dd4a-kube-api-access-q4bsg\") pod \"glance-db-create-ksn8k\" (UID: \"31afef42-8484-46bf-ab3e-be1b8eb4dd4a\") " pod="openstack/glance-db-create-ksn8k"
Oct 06 07:00:14 crc kubenswrapper[4845]: I1006 07:00:14.765966 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ksn8k"
Oct 06 07:00:15 crc kubenswrapper[4845]: I1006 07:00:15.189692 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ksn8k"]
Oct 06 07:00:15 crc kubenswrapper[4845]: W1006 07:00:15.198982 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31afef42_8484_46bf_ab3e_be1b8eb4dd4a.slice/crio-874f408147cd6cbafe257af78039e87d6903b25ec6aa57a703345e3248bd10ef WatchSource:0}: Error finding container 874f408147cd6cbafe257af78039e87d6903b25ec6aa57a703345e3248bd10ef: Status 404 returned error can't find the container with id 874f408147cd6cbafe257af78039e87d6903b25ec6aa57a703345e3248bd10ef
Oct 06 07:00:15 crc kubenswrapper[4845]: I1006 07:00:15.461843 4845 generic.go:334] "Generic (PLEG): container finished" podID="31afef42-8484-46bf-ab3e-be1b8eb4dd4a" containerID="69fd1df320dce22c38062a718f785d3ff4f405588ba34bec3b404a1b9b8d147c" exitCode=0
Oct 06 07:00:15 crc kubenswrapper[4845]: I1006 07:00:15.461884 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ksn8k" event={"ID":"31afef42-8484-46bf-ab3e-be1b8eb4dd4a","Type":"ContainerDied","Data":"69fd1df320dce22c38062a718f785d3ff4f405588ba34bec3b404a1b9b8d147c"}
Oct 06 07:00:15 crc kubenswrapper[4845]: I1006 07:00:15.461908 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ksn8k" event={"ID":"31afef42-8484-46bf-ab3e-be1b8eb4dd4a","Type":"ContainerStarted","Data":"874f408147cd6cbafe257af78039e87d6903b25ec6aa57a703345e3248bd10ef"}
Oct 06 07:00:16 crc kubenswrapper[4845]: I1006 07:00:16.028459 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-etc-swift\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0"
Oct 06 07:00:16 crc kubenswrapper[4845]: E1006 07:00:16.028699 4845 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 06 07:00:16 crc kubenswrapper[4845]: E1006 07:00:16.028714 4845 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 06 07:00:16 crc kubenswrapper[4845]: E1006 07:00:16.028755 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-etc-swift podName:ede02a6f-9a89-4a1d-960d-10490334fbd7 nodeName:}" failed. No retries permitted until 2025-10-06 07:00:20.028741324 +0000 UTC m=+904.543482332 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-etc-swift") pod "swift-storage-0" (UID: "ede02a6f-9a89-4a1d-960d-10490334fbd7") : configmap "swift-ring-files" not found
Oct 06 07:00:16 crc kubenswrapper[4845]: I1006 07:00:16.239175 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea50f878-3787-4fe1-adb0-ba16684aec21" path="/var/lib/kubelet/pods/ea50f878-3787-4fe1-adb0-ba16684aec21/volumes"
Oct 06 07:00:17 crc kubenswrapper[4845]: I1006 07:00:17.198040 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ksn8k"
Oct 06 07:00:17 crc kubenswrapper[4845]: I1006 07:00:17.349659 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4bsg\" (UniqueName: \"kubernetes.io/projected/31afef42-8484-46bf-ab3e-be1b8eb4dd4a-kube-api-access-q4bsg\") pod \"31afef42-8484-46bf-ab3e-be1b8eb4dd4a\" (UID: \"31afef42-8484-46bf-ab3e-be1b8eb4dd4a\") "
Oct 06 07:00:17 crc kubenswrapper[4845]: I1006 07:00:17.357742 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31afef42-8484-46bf-ab3e-be1b8eb4dd4a-kube-api-access-q4bsg" (OuterVolumeSpecName: "kube-api-access-q4bsg") pod "31afef42-8484-46bf-ab3e-be1b8eb4dd4a" (UID: "31afef42-8484-46bf-ab3e-be1b8eb4dd4a"). InnerVolumeSpecName "kube-api-access-q4bsg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:00:17 crc kubenswrapper[4845]: I1006 07:00:17.451912 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4bsg\" (UniqueName: \"kubernetes.io/projected/31afef42-8484-46bf-ab3e-be1b8eb4dd4a-kube-api-access-q4bsg\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:17 crc kubenswrapper[4845]: I1006 07:00:17.484210 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-86cb9" event={"ID":"955c6fb6-c2a5-48cd-8680-632f32157e5c","Type":"ContainerStarted","Data":"b1641740da25bbb66a896b82f06baa31cf05be52a40716394c4727f3faae2c90"}
Oct 06 07:00:17 crc kubenswrapper[4845]: I1006 07:00:17.486205 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ksn8k" event={"ID":"31afef42-8484-46bf-ab3e-be1b8eb4dd4a","Type":"ContainerDied","Data":"874f408147cd6cbafe257af78039e87d6903b25ec6aa57a703345e3248bd10ef"}
Oct 06 07:00:17 crc kubenswrapper[4845]: I1006 07:00:17.486230 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="874f408147cd6cbafe257af78039e87d6903b25ec6aa57a703345e3248bd10ef"
Oct 06 07:00:17 crc kubenswrapper[4845]: I1006 07:00:17.486270 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ksn8k"
Oct 06 07:00:17 crc kubenswrapper[4845]: I1006 07:00:17.506275 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-86cb9" podStartSLOduration=1.861543884 podStartE2EDuration="5.506257582s" podCreationTimestamp="2025-10-06 07:00:12 +0000 UTC" firstStartedPulling="2025-10-06 07:00:13.580641954 +0000 UTC m=+898.095382982" lastFinishedPulling="2025-10-06 07:00:17.225355672 +0000 UTC m=+901.740096680" observedRunningTime="2025-10-06 07:00:17.499464441 +0000 UTC m=+902.014205459" watchObservedRunningTime="2025-10-06 07:00:17.506257582 +0000 UTC m=+902.020998590"
Oct 06 07:00:18 crc kubenswrapper[4845]: I1006 07:00:18.720380 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-w87zr"]
Oct 06 07:00:18 crc kubenswrapper[4845]: E1006 07:00:18.721165 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31afef42-8484-46bf-ab3e-be1b8eb4dd4a" containerName="mariadb-database-create"
Oct 06 07:00:18 crc kubenswrapper[4845]: I1006 07:00:18.721183 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="31afef42-8484-46bf-ab3e-be1b8eb4dd4a" containerName="mariadb-database-create"
Oct 06 07:00:18 crc kubenswrapper[4845]: I1006 07:00:18.721401 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="31afef42-8484-46bf-ab3e-be1b8eb4dd4a" containerName="mariadb-database-create"
Oct 06 07:00:18 crc kubenswrapper[4845]: I1006 07:00:18.722078 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-w87zr"
Oct 06 07:00:18 crc kubenswrapper[4845]: I1006 07:00:18.740912 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-w87zr"]
Oct 06 07:00:18 crc kubenswrapper[4845]: I1006 07:00:18.773079 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z85j\" (UniqueName: \"kubernetes.io/projected/1dbb4b1f-cb80-4588-9871-c7df1d082077-kube-api-access-7z85j\") pod \"keystone-db-create-w87zr\" (UID: \"1dbb4b1f-cb80-4588-9871-c7df1d082077\") " pod="openstack/keystone-db-create-w87zr"
Oct 06 07:00:18 crc kubenswrapper[4845]: I1006 07:00:18.875023 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z85j\" (UniqueName: \"kubernetes.io/projected/1dbb4b1f-cb80-4588-9871-c7df1d082077-kube-api-access-7z85j\") pod \"keystone-db-create-w87zr\" (UID: \"1dbb4b1f-cb80-4588-9871-c7df1d082077\") " pod="openstack/keystone-db-create-w87zr"
Oct 06 07:00:18 crc kubenswrapper[4845]: I1006 07:00:18.900528 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z85j\" (UniqueName: \"kubernetes.io/projected/1dbb4b1f-cb80-4588-9871-c7df1d082077-kube-api-access-7z85j\") pod \"keystone-db-create-w87zr\" (UID: \"1dbb4b1f-cb80-4588-9871-c7df1d082077\") " pod="openstack/keystone-db-create-w87zr"
Oct 06 07:00:19 crc kubenswrapper[4845]: I1006 07:00:19.010675 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-kf6m4"]
Oct 06 07:00:19 crc kubenswrapper[4845]: I1006 07:00:19.011657 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kf6m4"
Oct 06 07:00:19 crc kubenswrapper[4845]: I1006 07:00:19.029151 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kf6m4"]
Oct 06 07:00:19 crc kubenswrapper[4845]: I1006 07:00:19.036627 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-w87zr"
Oct 06 07:00:19 crc kubenswrapper[4845]: I1006 07:00:19.179761 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn28n\" (UniqueName: \"kubernetes.io/projected/59bbac9c-a7a6-485d-b796-ca047c8391c7-kube-api-access-hn28n\") pod \"placement-db-create-kf6m4\" (UID: \"59bbac9c-a7a6-485d-b796-ca047c8391c7\") " pod="openstack/placement-db-create-kf6m4"
Oct 06 07:00:19 crc kubenswrapper[4845]: I1006 07:00:19.281522 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn28n\" (UniqueName: \"kubernetes.io/projected/59bbac9c-a7a6-485d-b796-ca047c8391c7-kube-api-access-hn28n\") pod \"placement-db-create-kf6m4\" (UID: \"59bbac9c-a7a6-485d-b796-ca047c8391c7\") " pod="openstack/placement-db-create-kf6m4"
Oct 06 07:00:19 crc kubenswrapper[4845]: I1006 07:00:19.325561 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn28n\" (UniqueName: \"kubernetes.io/projected/59bbac9c-a7a6-485d-b796-ca047c8391c7-kube-api-access-hn28n\") pod \"placement-db-create-kf6m4\" (UID: \"59bbac9c-a7a6-485d-b796-ca047c8391c7\") " pod="openstack/placement-db-create-kf6m4"
Oct 06 07:00:19 crc kubenswrapper[4845]: I1006 07:00:19.355765 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kf6m4"
Oct 06 07:00:19 crc kubenswrapper[4845]: I1006 07:00:19.496839 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-w87zr"]
Oct 06 07:00:19 crc kubenswrapper[4845]: W1006 07:00:19.514416 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dbb4b1f_cb80_4588_9871_c7df1d082077.slice/crio-5c09fdb2007fa618d0e0eea712ade93144a73349278857f51f1daa0d243b6199 WatchSource:0}: Error finding container 5c09fdb2007fa618d0e0eea712ade93144a73349278857f51f1daa0d243b6199: Status 404 returned error can't find the container with id 5c09fdb2007fa618d0e0eea712ade93144a73349278857f51f1daa0d243b6199
Oct 06 07:00:19 crc kubenswrapper[4845]: I1006 07:00:19.847333 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kf6m4"]
Oct 06 07:00:19 crc kubenswrapper[4845]: I1006 07:00:19.858683 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w6t7s"]
Oct 06 07:00:19 crc kubenswrapper[4845]: I1006 07:00:19.860858 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w6t7s"
Oct 06 07:00:19 crc kubenswrapper[4845]: I1006 07:00:19.873856 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w6t7s"]
Oct 06 07:00:19 crc kubenswrapper[4845]: I1006 07:00:19.994054 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkpr5\" (UniqueName: \"kubernetes.io/projected/d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43-kube-api-access-pkpr5\") pod \"redhat-operators-w6t7s\" (UID: \"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43\") " pod="openshift-marketplace/redhat-operators-w6t7s"
Oct 06 07:00:19 crc kubenswrapper[4845]: I1006 07:00:19.994433 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43-utilities\") pod \"redhat-operators-w6t7s\" (UID: \"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43\") " pod="openshift-marketplace/redhat-operators-w6t7s"
Oct 06 07:00:19 crc kubenswrapper[4845]: I1006 07:00:19.994546 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43-catalog-content\") pod \"redhat-operators-w6t7s\" (UID: \"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43\") " pod="openshift-marketplace/redhat-operators-w6t7s"
Oct 06 07:00:20 crc kubenswrapper[4845]: I1006 07:00:20.096411 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-etc-swift\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0"
Oct 06 07:00:20 crc kubenswrapper[4845]: I1006 07:00:20.096621 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkpr5\" (UniqueName:
(UniqueName: \"kubernetes.io/projected/d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43-kube-api-access-pkpr5\") pod \"redhat-operators-w6t7s\" (UID: \"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43\") " pod="openshift-marketplace/redhat-operators-w6t7s" Oct 06 07:00:20 crc kubenswrapper[4845]: I1006 07:00:20.096943 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43-utilities\") pod \"redhat-operators-w6t7s\" (UID: \"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43\") " pod="openshift-marketplace/redhat-operators-w6t7s" Oct 06 07:00:20 crc kubenswrapper[4845]: E1006 07:00:20.096566 4845 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 07:00:20 crc kubenswrapper[4845]: E1006 07:00:20.096990 4845 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 07:00:20 crc kubenswrapper[4845]: E1006 07:00:20.097025 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-etc-swift podName:ede02a6f-9a89-4a1d-960d-10490334fbd7 nodeName:}" failed. No retries permitted until 2025-10-06 07:00:28.097013193 +0000 UTC m=+912.611754201 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-etc-swift") pod "swift-storage-0" (UID: "ede02a6f-9a89-4a1d-960d-10490334fbd7") : configmap "swift-ring-files" not found Oct 06 07:00:20 crc kubenswrapper[4845]: I1006 07:00:20.097329 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43-catalog-content\") pod \"redhat-operators-w6t7s\" (UID: \"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43\") " pod="openshift-marketplace/redhat-operators-w6t7s" Oct 06 07:00:20 crc kubenswrapper[4845]: I1006 07:00:20.097562 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43-utilities\") pod \"redhat-operators-w6t7s\" (UID: \"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43\") " pod="openshift-marketplace/redhat-operators-w6t7s" Oct 06 07:00:20 crc kubenswrapper[4845]: I1006 07:00:20.098192 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43-catalog-content\") pod \"redhat-operators-w6t7s\" (UID: \"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43\") " pod="openshift-marketplace/redhat-operators-w6t7s" Oct 06 07:00:20 crc kubenswrapper[4845]: I1006 07:00:20.115957 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkpr5\" (UniqueName: \"kubernetes.io/projected/d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43-kube-api-access-pkpr5\") pod \"redhat-operators-w6t7s\" (UID: \"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43\") " pod="openshift-marketplace/redhat-operators-w6t7s" Oct 06 07:00:20 crc kubenswrapper[4845]: I1006 07:00:20.208533 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w6t7s" Oct 06 07:00:20 crc kubenswrapper[4845]: I1006 07:00:20.512129 4845 generic.go:334] "Generic (PLEG): container finished" podID="1dbb4b1f-cb80-4588-9871-c7df1d082077" containerID="de3986ecf7ebfda08bc8723317f9b9ac87dd96278e9df91892d3b3294d4c3029" exitCode=0 Oct 06 07:00:20 crc kubenswrapper[4845]: I1006 07:00:20.512222 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-w87zr" event={"ID":"1dbb4b1f-cb80-4588-9871-c7df1d082077","Type":"ContainerDied","Data":"de3986ecf7ebfda08bc8723317f9b9ac87dd96278e9df91892d3b3294d4c3029"} Oct 06 07:00:20 crc kubenswrapper[4845]: I1006 07:00:20.512247 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-w87zr" event={"ID":"1dbb4b1f-cb80-4588-9871-c7df1d082077","Type":"ContainerStarted","Data":"5c09fdb2007fa618d0e0eea712ade93144a73349278857f51f1daa0d243b6199"} Oct 06 07:00:20 crc kubenswrapper[4845]: I1006 07:00:20.514122 4845 generic.go:334] "Generic (PLEG): container finished" podID="59bbac9c-a7a6-485d-b796-ca047c8391c7" containerID="7786d1de1fc95aec0e85df20eb485eeb8252c9e3a1ae49ab99de61886a7c60b3" exitCode=0 Oct 06 07:00:20 crc kubenswrapper[4845]: I1006 07:00:20.514169 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kf6m4" event={"ID":"59bbac9c-a7a6-485d-b796-ca047c8391c7","Type":"ContainerDied","Data":"7786d1de1fc95aec0e85df20eb485eeb8252c9e3a1ae49ab99de61886a7c60b3"} Oct 06 07:00:20 crc kubenswrapper[4845]: I1006 07:00:20.514184 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kf6m4" event={"ID":"59bbac9c-a7a6-485d-b796-ca047c8391c7","Type":"ContainerStarted","Data":"a1e51770bf73557b149b43fcee34fdeb87ab734cb59732f239088bb8b8b5349b"} Oct 06 07:00:20 crc kubenswrapper[4845]: I1006 07:00:20.694132 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w6t7s"] 
Oct 06 07:00:21 crc kubenswrapper[4845]: I1006 07:00:21.376107 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 06 07:00:21 crc kubenswrapper[4845]: I1006 07:00:21.482998 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c7856c787-p69z9" Oct 06 07:00:21 crc kubenswrapper[4845]: I1006 07:00:21.533360 4845 generic.go:334] "Generic (PLEG): container finished" podID="d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43" containerID="83c2386fa20721148c15c3e439117b6c547ee49859468632965751c9e37ed9c6" exitCode=0 Oct 06 07:00:21 crc kubenswrapper[4845]: I1006 07:00:21.535572 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6t7s" event={"ID":"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43","Type":"ContainerDied","Data":"83c2386fa20721148c15c3e439117b6c547ee49859468632965751c9e37ed9c6"} Oct 06 07:00:21 crc kubenswrapper[4845]: I1006 07:00:21.535607 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6t7s" event={"ID":"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43","Type":"ContainerStarted","Data":"8753cdc2cb0fc38da025dea23e4e078c05fc8943c52c9224512c491c4083c3b3"} Oct 06 07:00:21 crc kubenswrapper[4845]: I1006 07:00:21.570701 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7495cbc78c-v5dbv"] Oct 06 07:00:21 crc kubenswrapper[4845]: I1006 07:00:21.570994 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" podUID="a12043d8-5d3d-4eb7-918e-c8b620d880ca" containerName="dnsmasq-dns" containerID="cri-o://7f2db1ed13e49921c64d456b06660f47b779b48a985685352e4261a6400cbeed" gracePeriod=10 Oct 06 07:00:21 crc kubenswrapper[4845]: I1006 07:00:21.978091 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-w87zr" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.034671 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z85j\" (UniqueName: \"kubernetes.io/projected/1dbb4b1f-cb80-4588-9871-c7df1d082077-kube-api-access-7z85j\") pod \"1dbb4b1f-cb80-4588-9871-c7df1d082077\" (UID: \"1dbb4b1f-cb80-4588-9871-c7df1d082077\") " Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.041824 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dbb4b1f-cb80-4588-9871-c7df1d082077-kube-api-access-7z85j" (OuterVolumeSpecName: "kube-api-access-7z85j") pod "1dbb4b1f-cb80-4588-9871-c7df1d082077" (UID: "1dbb4b1f-cb80-4588-9871-c7df1d082077"). InnerVolumeSpecName "kube-api-access-7z85j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.110290 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kf6m4" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.123555 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.138526 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z85j\" (UniqueName: \"kubernetes.io/projected/1dbb4b1f-cb80-4588-9871-c7df1d082077-kube-api-access-7z85j\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.240691 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfsn4\" (UniqueName: \"kubernetes.io/projected/a12043d8-5d3d-4eb7-918e-c8b620d880ca-kube-api-access-rfsn4\") pod \"a12043d8-5d3d-4eb7-918e-c8b620d880ca\" (UID: \"a12043d8-5d3d-4eb7-918e-c8b620d880ca\") " Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.240759 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn28n\" (UniqueName: \"kubernetes.io/projected/59bbac9c-a7a6-485d-b796-ca047c8391c7-kube-api-access-hn28n\") pod \"59bbac9c-a7a6-485d-b796-ca047c8391c7\" (UID: \"59bbac9c-a7a6-485d-b796-ca047c8391c7\") " Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.240833 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a12043d8-5d3d-4eb7-918e-c8b620d880ca-dns-svc\") pod \"a12043d8-5d3d-4eb7-918e-c8b620d880ca\" (UID: \"a12043d8-5d3d-4eb7-918e-c8b620d880ca\") " Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.240888 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a12043d8-5d3d-4eb7-918e-c8b620d880ca-config\") pod \"a12043d8-5d3d-4eb7-918e-c8b620d880ca\" (UID: \"a12043d8-5d3d-4eb7-918e-c8b620d880ca\") " Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.245456 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59bbac9c-a7a6-485d-b796-ca047c8391c7-kube-api-access-hn28n" 
(OuterVolumeSpecName: "kube-api-access-hn28n") pod "59bbac9c-a7a6-485d-b796-ca047c8391c7" (UID: "59bbac9c-a7a6-485d-b796-ca047c8391c7"). InnerVolumeSpecName "kube-api-access-hn28n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.256800 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a12043d8-5d3d-4eb7-918e-c8b620d880ca-kube-api-access-rfsn4" (OuterVolumeSpecName: "kube-api-access-rfsn4") pod "a12043d8-5d3d-4eb7-918e-c8b620d880ca" (UID: "a12043d8-5d3d-4eb7-918e-c8b620d880ca"). InnerVolumeSpecName "kube-api-access-rfsn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:00:22 crc kubenswrapper[4845]: E1006 07:00:22.295575 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a12043d8-5d3d-4eb7-918e-c8b620d880ca-config podName:a12043d8-5d3d-4eb7-918e-c8b620d880ca nodeName:}" failed. No retries permitted until 2025-10-06 07:00:22.795539689 +0000 UTC m=+907.310280708 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/configmap/a12043d8-5d3d-4eb7-918e-c8b620d880ca-config") pod "a12043d8-5d3d-4eb7-918e-c8b620d880ca" (UID: "a12043d8-5d3d-4eb7-918e-c8b620d880ca") : error deleting /var/lib/kubelet/pods/a12043d8-5d3d-4eb7-918e-c8b620d880ca/volume-subpaths: remove /var/lib/kubelet/pods/a12043d8-5d3d-4eb7-918e-c8b620d880ca/volume-subpaths: no such file or directory Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.296097 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a12043d8-5d3d-4eb7-918e-c8b620d880ca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a12043d8-5d3d-4eb7-918e-c8b620d880ca" (UID: "a12043d8-5d3d-4eb7-918e-c8b620d880ca"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.346293 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfsn4\" (UniqueName: \"kubernetes.io/projected/a12043d8-5d3d-4eb7-918e-c8b620d880ca-kube-api-access-rfsn4\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.346448 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn28n\" (UniqueName: \"kubernetes.io/projected/59bbac9c-a7a6-485d-b796-ca047c8391c7-kube-api-access-hn28n\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.346514 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a12043d8-5d3d-4eb7-918e-c8b620d880ca-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.545775 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6t7s" event={"ID":"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43","Type":"ContainerStarted","Data":"756de117249d73a6eaad0c7d787cee29b36ccd13b30f71e3c18d2c40dc1b5a24"} Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.549824 4845 generic.go:334] "Generic (PLEG): container finished" podID="a12043d8-5d3d-4eb7-918e-c8b620d880ca" containerID="7f2db1ed13e49921c64d456b06660f47b779b48a985685352e4261a6400cbeed" exitCode=0 Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.549965 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" event={"ID":"a12043d8-5d3d-4eb7-918e-c8b620d880ca","Type":"ContainerDied","Data":"7f2db1ed13e49921c64d456b06660f47b779b48a985685352e4261a6400cbeed"} Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.550047 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" 
event={"ID":"a12043d8-5d3d-4eb7-918e-c8b620d880ca","Type":"ContainerDied","Data":"edd4e0e13e0a3e51cb285f33d1eff520c280a600656b035e41a8f24f667cbc71"} Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.550125 4845 scope.go:117] "RemoveContainer" containerID="7f2db1ed13e49921c64d456b06660f47b779b48a985685352e4261a6400cbeed" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.550289 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7495cbc78c-v5dbv" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.568319 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-w87zr" event={"ID":"1dbb4b1f-cb80-4588-9871-c7df1d082077","Type":"ContainerDied","Data":"5c09fdb2007fa618d0e0eea712ade93144a73349278857f51f1daa0d243b6199"} Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.569028 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c09fdb2007fa618d0e0eea712ade93144a73349278857f51f1daa0d243b6199" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.568663 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-w87zr" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.571124 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kf6m4" event={"ID":"59bbac9c-a7a6-485d-b796-ca047c8391c7","Type":"ContainerDied","Data":"a1e51770bf73557b149b43fcee34fdeb87ab734cb59732f239088bb8b8b5349b"} Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.571196 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1e51770bf73557b149b43fcee34fdeb87ab734cb59732f239088bb8b8b5349b" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.571306 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kf6m4" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.591928 4845 scope.go:117] "RemoveContainer" containerID="8847c4e5abc6c866d6acd047931e122a9d7670c1599b52b4a88b34607c715f4b" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.614667 4845 scope.go:117] "RemoveContainer" containerID="7f2db1ed13e49921c64d456b06660f47b779b48a985685352e4261a6400cbeed" Oct 06 07:00:22 crc kubenswrapper[4845]: E1006 07:00:22.615069 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f2db1ed13e49921c64d456b06660f47b779b48a985685352e4261a6400cbeed\": container with ID starting with 7f2db1ed13e49921c64d456b06660f47b779b48a985685352e4261a6400cbeed not found: ID does not exist" containerID="7f2db1ed13e49921c64d456b06660f47b779b48a985685352e4261a6400cbeed" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.615099 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f2db1ed13e49921c64d456b06660f47b779b48a985685352e4261a6400cbeed"} err="failed to get container status \"7f2db1ed13e49921c64d456b06660f47b779b48a985685352e4261a6400cbeed\": rpc error: code = NotFound desc = could not find container \"7f2db1ed13e49921c64d456b06660f47b779b48a985685352e4261a6400cbeed\": container with ID starting with 7f2db1ed13e49921c64d456b06660f47b779b48a985685352e4261a6400cbeed not found: ID does not exist" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.615120 4845 scope.go:117] "RemoveContainer" containerID="8847c4e5abc6c866d6acd047931e122a9d7670c1599b52b4a88b34607c715f4b" Oct 06 07:00:22 crc kubenswrapper[4845]: E1006 07:00:22.615519 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8847c4e5abc6c866d6acd047931e122a9d7670c1599b52b4a88b34607c715f4b\": container with ID starting with 
8847c4e5abc6c866d6acd047931e122a9d7670c1599b52b4a88b34607c715f4b not found: ID does not exist" containerID="8847c4e5abc6c866d6acd047931e122a9d7670c1599b52b4a88b34607c715f4b" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.615687 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8847c4e5abc6c866d6acd047931e122a9d7670c1599b52b4a88b34607c715f4b"} err="failed to get container status \"8847c4e5abc6c866d6acd047931e122a9d7670c1599b52b4a88b34607c715f4b\": rpc error: code = NotFound desc = could not find container \"8847c4e5abc6c866d6acd047931e122a9d7670c1599b52b4a88b34607c715f4b\": container with ID starting with 8847c4e5abc6c866d6acd047931e122a9d7670c1599b52b4a88b34607c715f4b not found: ID does not exist" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.855205 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a12043d8-5d3d-4eb7-918e-c8b620d880ca-config\") pod \"a12043d8-5d3d-4eb7-918e-c8b620d880ca\" (UID: \"a12043d8-5d3d-4eb7-918e-c8b620d880ca\") " Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.858498 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a12043d8-5d3d-4eb7-918e-c8b620d880ca-config" (OuterVolumeSpecName: "config") pod "a12043d8-5d3d-4eb7-918e-c8b620d880ca" (UID: "a12043d8-5d3d-4eb7-918e-c8b620d880ca"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:00:22 crc kubenswrapper[4845]: I1006 07:00:22.958009 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a12043d8-5d3d-4eb7-918e-c8b620d880ca-config\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:23 crc kubenswrapper[4845]: I1006 07:00:23.018745 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 07:00:23 crc kubenswrapper[4845]: I1006 07:00:23.018803 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 07:00:23 crc kubenswrapper[4845]: I1006 07:00:23.180214 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7495cbc78c-v5dbv"] Oct 06 07:00:23 crc kubenswrapper[4845]: I1006 07:00:23.187153 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7495cbc78c-v5dbv"] Oct 06 07:00:23 crc kubenswrapper[4845]: I1006 07:00:23.581002 4845 generic.go:334] "Generic (PLEG): container finished" podID="d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43" containerID="756de117249d73a6eaad0c7d787cee29b36ccd13b30f71e3c18d2c40dc1b5a24" exitCode=0 Oct 06 07:00:23 crc kubenswrapper[4845]: I1006 07:00:23.581049 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6t7s" event={"ID":"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43","Type":"ContainerDied","Data":"756de117249d73a6eaad0c7d787cee29b36ccd13b30f71e3c18d2c40dc1b5a24"} Oct 06 07:00:24 crc kubenswrapper[4845]: 
I1006 07:00:24.241812 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a12043d8-5d3d-4eb7-918e-c8b620d880ca" path="/var/lib/kubelet/pods/a12043d8-5d3d-4eb7-918e-c8b620d880ca/volumes" Oct 06 07:00:24 crc kubenswrapper[4845]: I1006 07:00:24.529903 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9389-account-create-rzx65"] Oct 06 07:00:24 crc kubenswrapper[4845]: E1006 07:00:24.530518 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12043d8-5d3d-4eb7-918e-c8b620d880ca" containerName="init" Oct 06 07:00:24 crc kubenswrapper[4845]: I1006 07:00:24.530554 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12043d8-5d3d-4eb7-918e-c8b620d880ca" containerName="init" Oct 06 07:00:24 crc kubenswrapper[4845]: E1006 07:00:24.530578 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dbb4b1f-cb80-4588-9871-c7df1d082077" containerName="mariadb-database-create" Oct 06 07:00:24 crc kubenswrapper[4845]: I1006 07:00:24.530594 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dbb4b1f-cb80-4588-9871-c7df1d082077" containerName="mariadb-database-create" Oct 06 07:00:24 crc kubenswrapper[4845]: E1006 07:00:24.530622 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12043d8-5d3d-4eb7-918e-c8b620d880ca" containerName="dnsmasq-dns" Oct 06 07:00:24 crc kubenswrapper[4845]: I1006 07:00:24.530637 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12043d8-5d3d-4eb7-918e-c8b620d880ca" containerName="dnsmasq-dns" Oct 06 07:00:24 crc kubenswrapper[4845]: E1006 07:00:24.530666 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59bbac9c-a7a6-485d-b796-ca047c8391c7" containerName="mariadb-database-create" Oct 06 07:00:24 crc kubenswrapper[4845]: I1006 07:00:24.530679 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="59bbac9c-a7a6-485d-b796-ca047c8391c7" containerName="mariadb-database-create" Oct 06 07:00:24 crc 
kubenswrapper[4845]: I1006 07:00:24.530983 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="a12043d8-5d3d-4eb7-918e-c8b620d880ca" containerName="dnsmasq-dns" Oct 06 07:00:24 crc kubenswrapper[4845]: I1006 07:00:24.531028 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="59bbac9c-a7a6-485d-b796-ca047c8391c7" containerName="mariadb-database-create" Oct 06 07:00:24 crc kubenswrapper[4845]: I1006 07:00:24.531067 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dbb4b1f-cb80-4588-9871-c7df1d082077" containerName="mariadb-database-create" Oct 06 07:00:24 crc kubenswrapper[4845]: I1006 07:00:24.532084 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9389-account-create-rzx65" Oct 06 07:00:24 crc kubenswrapper[4845]: I1006 07:00:24.534363 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 06 07:00:24 crc kubenswrapper[4845]: I1006 07:00:24.540353 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9389-account-create-rzx65"] Oct 06 07:00:24 crc kubenswrapper[4845]: I1006 07:00:24.585987 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94jzv\" (UniqueName: \"kubernetes.io/projected/641d0255-700b-41be-899e-e23f2129be3a-kube-api-access-94jzv\") pod \"glance-9389-account-create-rzx65\" (UID: \"641d0255-700b-41be-899e-e23f2129be3a\") " pod="openstack/glance-9389-account-create-rzx65" Oct 06 07:00:24 crc kubenswrapper[4845]: I1006 07:00:24.593816 4845 generic.go:334] "Generic (PLEG): container finished" podID="955c6fb6-c2a5-48cd-8680-632f32157e5c" containerID="b1641740da25bbb66a896b82f06baa31cf05be52a40716394c4727f3faae2c90" exitCode=0 Oct 06 07:00:24 crc kubenswrapper[4845]: I1006 07:00:24.593900 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-86cb9" 
event={"ID":"955c6fb6-c2a5-48cd-8680-632f32157e5c","Type":"ContainerDied","Data":"b1641740da25bbb66a896b82f06baa31cf05be52a40716394c4727f3faae2c90"} Oct 06 07:00:24 crc kubenswrapper[4845]: I1006 07:00:24.687346 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94jzv\" (UniqueName: \"kubernetes.io/projected/641d0255-700b-41be-899e-e23f2129be3a-kube-api-access-94jzv\") pod \"glance-9389-account-create-rzx65\" (UID: \"641d0255-700b-41be-899e-e23f2129be3a\") " pod="openstack/glance-9389-account-create-rzx65" Oct 06 07:00:24 crc kubenswrapper[4845]: I1006 07:00:24.706284 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94jzv\" (UniqueName: \"kubernetes.io/projected/641d0255-700b-41be-899e-e23f2129be3a-kube-api-access-94jzv\") pod \"glance-9389-account-create-rzx65\" (UID: \"641d0255-700b-41be-899e-e23f2129be3a\") " pod="openstack/glance-9389-account-create-rzx65" Oct 06 07:00:24 crc kubenswrapper[4845]: I1006 07:00:24.851447 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9389-account-create-rzx65" Oct 06 07:00:25 crc kubenswrapper[4845]: W1006 07:00:25.346617 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod641d0255_700b_41be_899e_e23f2129be3a.slice/crio-517ced4a8edfe810c1981a8a51055926be26c41e74aa74651691ba6076be760f WatchSource:0}: Error finding container 517ced4a8edfe810c1981a8a51055926be26c41e74aa74651691ba6076be760f: Status 404 returned error can't find the container with id 517ced4a8edfe810c1981a8a51055926be26c41e74aa74651691ba6076be760f Oct 06 07:00:25 crc kubenswrapper[4845]: I1006 07:00:25.346865 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9389-account-create-rzx65"] Oct 06 07:00:25 crc kubenswrapper[4845]: I1006 07:00:25.603706 4845 generic.go:334] "Generic (PLEG): container finished" podID="641d0255-700b-41be-899e-e23f2129be3a" containerID="08b5e9c4a09c775e386026fc80e8f24023c6f116024871d0cd0e5491281761b8" exitCode=0 Oct 06 07:00:25 crc kubenswrapper[4845]: I1006 07:00:25.603820 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9389-account-create-rzx65" event={"ID":"641d0255-700b-41be-899e-e23f2129be3a","Type":"ContainerDied","Data":"08b5e9c4a09c775e386026fc80e8f24023c6f116024871d0cd0e5491281761b8"} Oct 06 07:00:25 crc kubenswrapper[4845]: I1006 07:00:25.603876 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9389-account-create-rzx65" event={"ID":"641d0255-700b-41be-899e-e23f2129be3a","Type":"ContainerStarted","Data":"517ced4a8edfe810c1981a8a51055926be26c41e74aa74651691ba6076be760f"} Oct 06 07:00:25 crc kubenswrapper[4845]: I1006 07:00:25.607912 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6t7s" event={"ID":"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43","Type":"ContainerStarted","Data":"5d3c622e672c91c75953fd9c59fbfe018957cc7952f968fb33e91f09e227656f"} Oct 
Oct 06 07:00:25 crc kubenswrapper[4845]: I1006 07:00:25.644738 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w6t7s" podStartSLOduration=3.490511644 podStartE2EDuration="6.644716856s" podCreationTimestamp="2025-10-06 07:00:19 +0000 UTC" firstStartedPulling="2025-10-06 07:00:21.537999862 +0000 UTC m=+906.052740880" lastFinishedPulling="2025-10-06 07:00:24.692205084 +0000 UTC m=+909.206946092" observedRunningTime="2025-10-06 07:00:25.641627696 +0000 UTC m=+910.156368714" watchObservedRunningTime="2025-10-06 07:00:25.644716856 +0000 UTC m=+910.159457884"
Oct 06 07:00:25 crc kubenswrapper[4845]: I1006 07:00:25.938400 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.016315 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/955c6fb6-c2a5-48cd-8680-632f32157e5c-ring-data-devices\") pod \"955c6fb6-c2a5-48cd-8680-632f32157e5c\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") "
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.016409 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/955c6fb6-c2a5-48cd-8680-632f32157e5c-swiftconf\") pod \"955c6fb6-c2a5-48cd-8680-632f32157e5c\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") "
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.016459 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/955c6fb6-c2a5-48cd-8680-632f32157e5c-scripts\") pod \"955c6fb6-c2a5-48cd-8680-632f32157e5c\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") "
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.016495 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rptd9\" (UniqueName: \"kubernetes.io/projected/955c6fb6-c2a5-48cd-8680-632f32157e5c-kube-api-access-rptd9\") pod \"955c6fb6-c2a5-48cd-8680-632f32157e5c\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") "
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.016579 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/955c6fb6-c2a5-48cd-8680-632f32157e5c-dispersionconf\") pod \"955c6fb6-c2a5-48cd-8680-632f32157e5c\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") "
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.016636 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/955c6fb6-c2a5-48cd-8680-632f32157e5c-etc-swift\") pod \"955c6fb6-c2a5-48cd-8680-632f32157e5c\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") "
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.016702 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955c6fb6-c2a5-48cd-8680-632f32157e5c-combined-ca-bundle\") pod \"955c6fb6-c2a5-48cd-8680-632f32157e5c\" (UID: \"955c6fb6-c2a5-48cd-8680-632f32157e5c\") "
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.017159 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/955c6fb6-c2a5-48cd-8680-632f32157e5c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "955c6fb6-c2a5-48cd-8680-632f32157e5c" (UID: "955c6fb6-c2a5-48cd-8680-632f32157e5c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.017793 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/955c6fb6-c2a5-48cd-8680-632f32157e5c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "955c6fb6-c2a5-48cd-8680-632f32157e5c" (UID: "955c6fb6-c2a5-48cd-8680-632f32157e5c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.022897 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/955c6fb6-c2a5-48cd-8680-632f32157e5c-kube-api-access-rptd9" (OuterVolumeSpecName: "kube-api-access-rptd9") pod "955c6fb6-c2a5-48cd-8680-632f32157e5c" (UID: "955c6fb6-c2a5-48cd-8680-632f32157e5c"). InnerVolumeSpecName "kube-api-access-rptd9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.023990 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/955c6fb6-c2a5-48cd-8680-632f32157e5c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "955c6fb6-c2a5-48cd-8680-632f32157e5c" (UID: "955c6fb6-c2a5-48cd-8680-632f32157e5c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.038055 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/955c6fb6-c2a5-48cd-8680-632f32157e5c-scripts" (OuterVolumeSpecName: "scripts") pod "955c6fb6-c2a5-48cd-8680-632f32157e5c" (UID: "955c6fb6-c2a5-48cd-8680-632f32157e5c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.039824 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/955c6fb6-c2a5-48cd-8680-632f32157e5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "955c6fb6-c2a5-48cd-8680-632f32157e5c" (UID: "955c6fb6-c2a5-48cd-8680-632f32157e5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.047538 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/955c6fb6-c2a5-48cd-8680-632f32157e5c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "955c6fb6-c2a5-48cd-8680-632f32157e5c" (UID: "955c6fb6-c2a5-48cd-8680-632f32157e5c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.119606 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/955c6fb6-c2a5-48cd-8680-632f32157e5c-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.119647 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rptd9\" (UniqueName: \"kubernetes.io/projected/955c6fb6-c2a5-48cd-8680-632f32157e5c-kube-api-access-rptd9\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.119664 4845 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/955c6fb6-c2a5-48cd-8680-632f32157e5c-dispersionconf\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.119678 4845 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/955c6fb6-c2a5-48cd-8680-632f32157e5c-etc-swift\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.119691 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955c6fb6-c2a5-48cd-8680-632f32157e5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.119704 4845 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/955c6fb6-c2a5-48cd-8680-632f32157e5c-ring-data-devices\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.119715 4845 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/955c6fb6-c2a5-48cd-8680-632f32157e5c-swiftconf\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.618125 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-86cb9" event={"ID":"955c6fb6-c2a5-48cd-8680-632f32157e5c","Type":"ContainerDied","Data":"ecb4a2fd2a873675e80f9ada6b31f9908ac2c04c2b81b7655ea86dc6706b09aa"}
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.618171 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecb4a2fd2a873675e80f9ada6b31f9908ac2c04c2b81b7655ea86dc6706b09aa"
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.618290 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-86cb9"
Oct 06 07:00:26 crc kubenswrapper[4845]: I1006 07:00:26.949544 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9389-account-create-rzx65"
Oct 06 07:00:27 crc kubenswrapper[4845]: I1006 07:00:27.033987 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94jzv\" (UniqueName: \"kubernetes.io/projected/641d0255-700b-41be-899e-e23f2129be3a-kube-api-access-94jzv\") pod \"641d0255-700b-41be-899e-e23f2129be3a\" (UID: \"641d0255-700b-41be-899e-e23f2129be3a\") "
Oct 06 07:00:27 crc kubenswrapper[4845]: I1006 07:00:27.037389 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/641d0255-700b-41be-899e-e23f2129be3a-kube-api-access-94jzv" (OuterVolumeSpecName: "kube-api-access-94jzv") pod "641d0255-700b-41be-899e-e23f2129be3a" (UID: "641d0255-700b-41be-899e-e23f2129be3a"). InnerVolumeSpecName "kube-api-access-94jzv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:00:27 crc kubenswrapper[4845]: I1006 07:00:27.136008 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94jzv\" (UniqueName: \"kubernetes.io/projected/641d0255-700b-41be-899e-e23f2129be3a-kube-api-access-94jzv\") on node \"crc\" DevicePath \"\""
Oct 06 07:00:27 crc kubenswrapper[4845]: I1006 07:00:27.628747 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9389-account-create-rzx65" event={"ID":"641d0255-700b-41be-899e-e23f2129be3a","Type":"ContainerDied","Data":"517ced4a8edfe810c1981a8a51055926be26c41e74aa74651691ba6076be760f"}
Oct 06 07:00:27 crc kubenswrapper[4845]: I1006 07:00:27.628791 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="517ced4a8edfe810c1981a8a51055926be26c41e74aa74651691ba6076be760f"
Oct 06 07:00:27 crc kubenswrapper[4845]: I1006 07:00:27.628839 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9389-account-create-rzx65"
Oct 06 07:00:28 crc kubenswrapper[4845]: I1006 07:00:28.151875 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-etc-swift\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0"
Oct 06 07:00:28 crc kubenswrapper[4845]: I1006 07:00:28.157527 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ede02a6f-9a89-4a1d-960d-10490334fbd7-etc-swift\") pod \"swift-storage-0\" (UID: \"ede02a6f-9a89-4a1d-960d-10490334fbd7\") " pod="openstack/swift-storage-0"
Oct 06 07:00:28 crc kubenswrapper[4845]: I1006 07:00:28.453475 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Oct 06 07:00:28 crc kubenswrapper[4845]: I1006 07:00:28.824149 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-a34d-account-create-d27s2"]
Oct 06 07:00:28 crc kubenswrapper[4845]: E1006 07:00:28.824822 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="641d0255-700b-41be-899e-e23f2129be3a" containerName="mariadb-account-create"
Oct 06 07:00:28 crc kubenswrapper[4845]: I1006 07:00:28.824835 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="641d0255-700b-41be-899e-e23f2129be3a" containerName="mariadb-account-create"
Oct 06 07:00:28 crc kubenswrapper[4845]: E1006 07:00:28.824857 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="955c6fb6-c2a5-48cd-8680-632f32157e5c" containerName="swift-ring-rebalance"
Oct 06 07:00:28 crc kubenswrapper[4845]: I1006 07:00:28.824863 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="955c6fb6-c2a5-48cd-8680-632f32157e5c" containerName="swift-ring-rebalance"
Oct 06 07:00:28 crc kubenswrapper[4845]: I1006 07:00:28.825002 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="641d0255-700b-41be-899e-e23f2129be3a" containerName="mariadb-account-create"
Oct 06 07:00:28 crc kubenswrapper[4845]: I1006 07:00:28.825019 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="955c6fb6-c2a5-48cd-8680-632f32157e5c" containerName="swift-ring-rebalance"
Oct 06 07:00:28 crc kubenswrapper[4845]: I1006 07:00:28.825569 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a34d-account-create-d27s2"
Oct 06 07:00:28 crc kubenswrapper[4845]: I1006 07:00:28.828763 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Oct 06 07:00:28 crc kubenswrapper[4845]: I1006 07:00:28.836994 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a34d-account-create-d27s2"]
Oct 06 07:00:28 crc kubenswrapper[4845]: I1006 07:00:28.968145 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4jj9\" (UniqueName: \"kubernetes.io/projected/59462ba4-763d-48f0-8d97-98113c788102-kube-api-access-n4jj9\") pod \"keystone-a34d-account-create-d27s2\" (UID: \"59462ba4-763d-48f0-8d97-98113c788102\") " pod="openstack/keystone-a34d-account-create-d27s2"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.000950 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.070699 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4jj9\" (UniqueName: \"kubernetes.io/projected/59462ba4-763d-48f0-8d97-98113c788102-kube-api-access-n4jj9\") pod \"keystone-a34d-account-create-d27s2\" (UID: \"59462ba4-763d-48f0-8d97-98113c788102\") " pod="openstack/keystone-a34d-account-create-d27s2"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.099190 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4jj9\" (UniqueName: \"kubernetes.io/projected/59462ba4-763d-48f0-8d97-98113c788102-kube-api-access-n4jj9\") pod \"keystone-a34d-account-create-d27s2\" (UID: \"59462ba4-763d-48f0-8d97-98113c788102\") " pod="openstack/keystone-a34d-account-create-d27s2"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.140019 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-0f95-account-create-59gks"]
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.141120 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0f95-account-create-59gks"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.144694 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.148882 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a34d-account-create-d27s2"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.150854 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0f95-account-create-59gks"]
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.273031 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnfdz\" (UniqueName: \"kubernetes.io/projected/ce3f5dd8-4bb8-4695-bd9b-5363016555f7-kube-api-access-lnfdz\") pod \"placement-0f95-account-create-59gks\" (UID: \"ce3f5dd8-4bb8-4695-bd9b-5363016555f7\") " pod="openstack/placement-0f95-account-create-59gks"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.338449 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-v4zd6" podUID="e2ee0908-39a9-4303-aad3-040a922d20a7" containerName="ovn-controller" probeResult="failure" output=<
Oct 06 07:00:29 crc kubenswrapper[4845]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Oct 06 07:00:29 crc kubenswrapper[4845]: >
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.357770 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-n9jwg"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.359846 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-n9jwg"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.376928 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnfdz\" (UniqueName: \"kubernetes.io/projected/ce3f5dd8-4bb8-4695-bd9b-5363016555f7-kube-api-access-lnfdz\") pod \"placement-0f95-account-create-59gks\" (UID: \"ce3f5dd8-4bb8-4695-bd9b-5363016555f7\") " pod="openstack/placement-0f95-account-create-59gks"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.409107 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnfdz\" (UniqueName: \"kubernetes.io/projected/ce3f5dd8-4bb8-4695-bd9b-5363016555f7-kube-api-access-lnfdz\") pod \"placement-0f95-account-create-59gks\" (UID: \"ce3f5dd8-4bb8-4695-bd9b-5363016555f7\") " pod="openstack/placement-0f95-account-create-59gks"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.461993 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0f95-account-create-59gks"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.568310 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a34d-account-create-d27s2"]
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.576742 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-v4zd6-config-ktrfq"]
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.577917 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: W1006 07:00:29.582759 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59462ba4_763d_48f0_8d97_98113c788102.slice/crio-c9c6efc9c2a5b768c569080c325aeb3544c1693b9d685b87598ff29478abc359 WatchSource:0}: Error finding container c9c6efc9c2a5b768c569080c325aeb3544c1693b9d685b87598ff29478abc359: Status 404 returned error can't find the container with id c9c6efc9c2a5b768c569080c325aeb3544c1693b9d685b87598ff29478abc359
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.583507 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.588450 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v4zd6-config-ktrfq"]
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.652677 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ede02a6f-9a89-4a1d-960d-10490334fbd7","Type":"ContainerStarted","Data":"d02886226f1c30c664671c76cd36ef5fb226d99b98e795219c065c1cff600a15"}
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.654462 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a34d-account-create-d27s2" event={"ID":"59462ba4-763d-48f0-8d97-98113c788102","Type":"ContainerStarted","Data":"c9c6efc9c2a5b768c569080c325aeb3544c1693b9d685b87598ff29478abc359"}
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.681456 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d65q\" (UniqueName: \"kubernetes.io/projected/296fc233-07ef-4acf-a5a8-f0f9e865f583-kube-api-access-6d65q\") pod \"ovn-controller-v4zd6-config-ktrfq\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.681587 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/296fc233-07ef-4acf-a5a8-f0f9e865f583-var-run\") pod \"ovn-controller-v4zd6-config-ktrfq\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.681633 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/296fc233-07ef-4acf-a5a8-f0f9e865f583-var-run-ovn\") pod \"ovn-controller-v4zd6-config-ktrfq\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.681669 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/296fc233-07ef-4acf-a5a8-f0f9e865f583-var-log-ovn\") pod \"ovn-controller-v4zd6-config-ktrfq\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.681724 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/296fc233-07ef-4acf-a5a8-f0f9e865f583-additional-scripts\") pod \"ovn-controller-v4zd6-config-ktrfq\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.681750 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/296fc233-07ef-4acf-a5a8-f0f9e865f583-scripts\") pod \"ovn-controller-v4zd6-config-ktrfq\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.736185 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-qnzpb"]
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.737337 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qnzpb"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.739187 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kvff7"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.739298 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.750425 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qnzpb"]
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.783348 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/296fc233-07ef-4acf-a5a8-f0f9e865f583-var-run\") pod \"ovn-controller-v4zd6-config-ktrfq\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.783466 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/296fc233-07ef-4acf-a5a8-f0f9e865f583-var-run-ovn\") pod \"ovn-controller-v4zd6-config-ktrfq\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.783506 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/296fc233-07ef-4acf-a5a8-f0f9e865f583-var-log-ovn\") pod \"ovn-controller-v4zd6-config-ktrfq\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.783555 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/296fc233-07ef-4acf-a5a8-f0f9e865f583-additional-scripts\") pod \"ovn-controller-v4zd6-config-ktrfq\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.783596 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/296fc233-07ef-4acf-a5a8-f0f9e865f583-scripts\") pod \"ovn-controller-v4zd6-config-ktrfq\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.783665 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d65q\" (UniqueName: \"kubernetes.io/projected/296fc233-07ef-4acf-a5a8-f0f9e865f583-kube-api-access-6d65q\") pod \"ovn-controller-v4zd6-config-ktrfq\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.783747 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/296fc233-07ef-4acf-a5a8-f0f9e865f583-var-run\") pod \"ovn-controller-v4zd6-config-ktrfq\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.783793 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/296fc233-07ef-4acf-a5a8-f0f9e865f583-var-run-ovn\") pod \"ovn-controller-v4zd6-config-ktrfq\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.783760 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/296fc233-07ef-4acf-a5a8-f0f9e865f583-var-log-ovn\") pod \"ovn-controller-v4zd6-config-ktrfq\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.784752 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/296fc233-07ef-4acf-a5a8-f0f9e865f583-additional-scripts\") pod \"ovn-controller-v4zd6-config-ktrfq\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.785953 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/296fc233-07ef-4acf-a5a8-f0f9e865f583-scripts\") pod \"ovn-controller-v4zd6-config-ktrfq\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.803208 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d65q\" (UniqueName: \"kubernetes.io/projected/296fc233-07ef-4acf-a5a8-f0f9e865f583-kube-api-access-6d65q\") pod \"ovn-controller-v4zd6-config-ktrfq\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.885290 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef9aaef1-2966-4818-9f26-87bcf275907d-db-sync-config-data\") pod \"glance-db-sync-qnzpb\" (UID: \"ef9aaef1-2966-4818-9f26-87bcf275907d\") " pod="openstack/glance-db-sync-qnzpb"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.885684 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9aaef1-2966-4818-9f26-87bcf275907d-combined-ca-bundle\") pod \"glance-db-sync-qnzpb\" (UID: \"ef9aaef1-2966-4818-9f26-87bcf275907d\") " pod="openstack/glance-db-sync-qnzpb"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.885962 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh6tl\" (UniqueName: \"kubernetes.io/projected/ef9aaef1-2966-4818-9f26-87bcf275907d-kube-api-access-vh6tl\") pod \"glance-db-sync-qnzpb\" (UID: \"ef9aaef1-2966-4818-9f26-87bcf275907d\") " pod="openstack/glance-db-sync-qnzpb"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.886180 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9aaef1-2966-4818-9f26-87bcf275907d-config-data\") pod \"glance-db-sync-qnzpb\" (UID: \"ef9aaef1-2966-4818-9f26-87bcf275907d\") " pod="openstack/glance-db-sync-qnzpb"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.911100 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v4zd6-config-ktrfq"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.937599 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0f95-account-create-59gks"]
Oct 06 07:00:29 crc kubenswrapper[4845]: W1006 07:00:29.957703 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce3f5dd8_4bb8_4695_bd9b_5363016555f7.slice/crio-a846d02e325ed6a78bb0a4948e316bcf9370dfce2712ea1542b2a3a790e817fa WatchSource:0}: Error finding container a846d02e325ed6a78bb0a4948e316bcf9370dfce2712ea1542b2a3a790e817fa: Status 404 returned error can't find the container with id a846d02e325ed6a78bb0a4948e316bcf9370dfce2712ea1542b2a3a790e817fa
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.987971 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef9aaef1-2966-4818-9f26-87bcf275907d-db-sync-config-data\") pod \"glance-db-sync-qnzpb\" (UID: \"ef9aaef1-2966-4818-9f26-87bcf275907d\") " pod="openstack/glance-db-sync-qnzpb"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.988037 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9aaef1-2966-4818-9f26-87bcf275907d-combined-ca-bundle\") pod \"glance-db-sync-qnzpb\" (UID: \"ef9aaef1-2966-4818-9f26-87bcf275907d\") " pod="openstack/glance-db-sync-qnzpb"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.988080 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh6tl\" (UniqueName: \"kubernetes.io/projected/ef9aaef1-2966-4818-9f26-87bcf275907d-kube-api-access-vh6tl\") pod \"glance-db-sync-qnzpb\" (UID: \"ef9aaef1-2966-4818-9f26-87bcf275907d\") " pod="openstack/glance-db-sync-qnzpb"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.988134 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9aaef1-2966-4818-9f26-87bcf275907d-config-data\") pod \"glance-db-sync-qnzpb\" (UID: \"ef9aaef1-2966-4818-9f26-87bcf275907d\") " pod="openstack/glance-db-sync-qnzpb"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.992555 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9aaef1-2966-4818-9f26-87bcf275907d-combined-ca-bundle\") pod \"glance-db-sync-qnzpb\" (UID: \"ef9aaef1-2966-4818-9f26-87bcf275907d\") " pod="openstack/glance-db-sync-qnzpb"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.993930 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef9aaef1-2966-4818-9f26-87bcf275907d-db-sync-config-data\") pod \"glance-db-sync-qnzpb\" (UID: \"ef9aaef1-2966-4818-9f26-87bcf275907d\") " pod="openstack/glance-db-sync-qnzpb"
Oct 06 07:00:29 crc kubenswrapper[4845]: I1006 07:00:29.994473 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9aaef1-2966-4818-9f26-87bcf275907d-config-data\") pod \"glance-db-sync-qnzpb\" (UID: \"ef9aaef1-2966-4818-9f26-87bcf275907d\") " pod="openstack/glance-db-sync-qnzpb"
Oct 06 07:00:30 crc kubenswrapper[4845]: I1006 07:00:30.014864 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh6tl\" (UniqueName: \"kubernetes.io/projected/ef9aaef1-2966-4818-9f26-87bcf275907d-kube-api-access-vh6tl\") pod \"glance-db-sync-qnzpb\" (UID: \"ef9aaef1-2966-4818-9f26-87bcf275907d\") " pod="openstack/glance-db-sync-qnzpb"
Oct 06 07:00:30 crc kubenswrapper[4845]: I1006 07:00:30.063821 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qnzpb"
Oct 06 07:00:30 crc kubenswrapper[4845]: I1006 07:00:30.145654 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v4zd6-config-ktrfq"]
Oct 06 07:00:30 crc kubenswrapper[4845]: I1006 07:00:30.209102 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w6t7s"
Oct 06 07:00:30 crc kubenswrapper[4845]: I1006 07:00:30.209367 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w6t7s"
Oct 06 07:00:30 crc kubenswrapper[4845]: I1006 07:00:30.283333 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w6t7s"
Oct 06 07:00:30 crc kubenswrapper[4845]: I1006 07:00:30.623610 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qnzpb"]
Oct 06 07:00:30 crc kubenswrapper[4845]: I1006 07:00:30.671066 4845 generic.go:334] "Generic (PLEG): container finished" podID="6a669571-1ec3-4cb8-8a07-e20c31ca87e5" containerID="96190030a88ff80858e707644abc72592e8c49a60ec3b2a2aedfd6d04a82a1df" exitCode=0
Oct 06 07:00:30 crc kubenswrapper[4845]: I1006 07:00:30.671135 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6a669571-1ec3-4cb8-8a07-e20c31ca87e5","Type":"ContainerDied","Data":"96190030a88ff80858e707644abc72592e8c49a60ec3b2a2aedfd6d04a82a1df"}
Oct 06 07:00:30 crc kubenswrapper[4845]: I1006 07:00:30.674136 4845 generic.go:334] "Generic (PLEG): container finished" podID="59462ba4-763d-48f0-8d97-98113c788102" containerID="6e1ba7ad92ce6aae9c596c3573f4a6baf356f82601c253c7f0a387812a2288ac" exitCode=0
Oct 06 07:00:30 crc kubenswrapper[4845]: I1006 07:00:30.674182 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a34d-account-create-d27s2" event={"ID":"59462ba4-763d-48f0-8d97-98113c788102","Type":"ContainerDied","Data":"6e1ba7ad92ce6aae9c596c3573f4a6baf356f82601c253c7f0a387812a2288ac"}
Oct 06 07:00:30 crc kubenswrapper[4845]: I1006 07:00:30.694400 4845 generic.go:334] "Generic (PLEG): container finished" podID="ce3f5dd8-4bb8-4695-bd9b-5363016555f7" containerID="44933b23a87e922443e622c8a72c3d3b56c3ac3b128ee3ae85a8ae43c1106d36" exitCode=0
Oct 06 07:00:30 crc kubenswrapper[4845]: I1006 07:00:30.694487 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0f95-account-create-59gks" event={"ID":"ce3f5dd8-4bb8-4695-bd9b-5363016555f7","Type":"ContainerDied","Data":"44933b23a87e922443e622c8a72c3d3b56c3ac3b128ee3ae85a8ae43c1106d36"}
Oct 06 07:00:30 crc kubenswrapper[4845]: I1006 07:00:30.694515 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0f95-account-create-59gks" event={"ID":"ce3f5dd8-4bb8-4695-bd9b-5363016555f7","Type":"ContainerStarted","Data":"a846d02e325ed6a78bb0a4948e316bcf9370dfce2712ea1542b2a3a790e817fa"}
Oct 06 07:00:30 crc kubenswrapper[4845]: I1006 07:00:30.706623 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v4zd6-config-ktrfq" event={"ID":"296fc233-07ef-4acf-a5a8-f0f9e865f583","Type":"ContainerStarted","Data":"3ccae0df7237bcb508d8c80ca8a9308e0b0d830a146e3163605d943100fb0e5f"}
Oct 06 07:00:30 crc kubenswrapper[4845]: I1006 07:00:30.706689 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v4zd6-config-ktrfq" event={"ID":"296fc233-07ef-4acf-a5a8-f0f9e865f583","Type":"ContainerStarted","Data":"62125d47774487f98296fbabf57d2e18927748ac409afa9032dfc77212856e70"}
Oct 06 07:00:30 crc kubenswrapper[4845]: I1006 07:00:30.757972 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-v4zd6-config-ktrfq" podStartSLOduration=1.7579474290000001 podStartE2EDuration="1.757947429s" podCreationTimestamp="2025-10-06 07:00:29 +0000 UTC"
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:00:30.755089246 +0000 UTC m=+915.269830274" watchObservedRunningTime="2025-10-06 07:00:30.757947429 +0000 UTC m=+915.272688437" Oct 06 07:00:30 crc kubenswrapper[4845]: I1006 07:00:30.769134 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w6t7s" Oct 06 07:00:30 crc kubenswrapper[4845]: I1006 07:00:30.814850 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w6t7s"] Oct 06 07:00:31 crc kubenswrapper[4845]: I1006 07:00:31.750225 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qnzpb" event={"ID":"ef9aaef1-2966-4818-9f26-87bcf275907d","Type":"ContainerStarted","Data":"c2360890984fb8d90bf76db37bd39aa38acd23db76d060c744a05497ba63a061"} Oct 06 07:00:31 crc kubenswrapper[4845]: I1006 07:00:31.761673 4845 generic.go:334] "Generic (PLEG): container finished" podID="296fc233-07ef-4acf-a5a8-f0f9e865f583" containerID="3ccae0df7237bcb508d8c80ca8a9308e0b0d830a146e3163605d943100fb0e5f" exitCode=0 Oct 06 07:00:31 crc kubenswrapper[4845]: I1006 07:00:31.761774 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v4zd6-config-ktrfq" event={"ID":"296fc233-07ef-4acf-a5a8-f0f9e865f583","Type":"ContainerDied","Data":"3ccae0df7237bcb508d8c80ca8a9308e0b0d830a146e3163605d943100fb0e5f"} Oct 06 07:00:31 crc kubenswrapper[4845]: I1006 07:00:31.798703 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ede02a6f-9a89-4a1d-960d-10490334fbd7","Type":"ContainerStarted","Data":"7487ee47d16989eec7a3f866a2bea0c24dffb712f59ef675c536a80e93540eca"} Oct 06 07:00:31 crc kubenswrapper[4845]: I1006 07:00:31.798754 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"ede02a6f-9a89-4a1d-960d-10490334fbd7","Type":"ContainerStarted","Data":"6e8b6fbb5c8339b51dbc65c7f7fd6f614816d368e58f72490d44d1681489b45b"} Oct 06 07:00:31 crc kubenswrapper[4845]: I1006 07:00:31.798764 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ede02a6f-9a89-4a1d-960d-10490334fbd7","Type":"ContainerStarted","Data":"688d66f8e09448b8bdd9b365a3b9608ee1ff1d152fa39ee058784286f32398e1"} Oct 06 07:00:31 crc kubenswrapper[4845]: I1006 07:00:31.798772 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ede02a6f-9a89-4a1d-960d-10490334fbd7","Type":"ContainerStarted","Data":"248b39fb19358fdfecc42ec3bea72bac79146f82863a0650bb668683bd0e8c66"} Oct 06 07:00:31 crc kubenswrapper[4845]: I1006 07:00:31.809892 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6a669571-1ec3-4cb8-8a07-e20c31ca87e5","Type":"ContainerStarted","Data":"0ee17b80a502005f4a41fe516463e1629e18c66e11e5aa7120fc2e30bdcbc00e"} Oct 06 07:00:31 crc kubenswrapper[4845]: I1006 07:00:31.810657 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 06 07:00:31 crc kubenswrapper[4845]: I1006 07:00:31.852063 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.72313292 podStartE2EDuration="1m7.85204352s" podCreationTimestamp="2025-10-06 06:59:24 +0000 UTC" firstStartedPulling="2025-10-06 06:59:27.151911675 +0000 UTC m=+851.666652683" lastFinishedPulling="2025-10-06 06:59:55.280822285 +0000 UTC m=+879.795563283" observedRunningTime="2025-10-06 07:00:31.846812606 +0000 UTC m=+916.361553624" watchObservedRunningTime="2025-10-06 07:00:31.85204352 +0000 UTC m=+916.366784528" Oct 06 07:00:32 crc kubenswrapper[4845]: I1006 07:00:32.429960 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a34d-account-create-d27s2" Oct 06 07:00:32 crc kubenswrapper[4845]: I1006 07:00:32.441666 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0f95-account-create-59gks" Oct 06 07:00:32 crc kubenswrapper[4845]: I1006 07:00:32.535234 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4jj9\" (UniqueName: \"kubernetes.io/projected/59462ba4-763d-48f0-8d97-98113c788102-kube-api-access-n4jj9\") pod \"59462ba4-763d-48f0-8d97-98113c788102\" (UID: \"59462ba4-763d-48f0-8d97-98113c788102\") " Oct 06 07:00:32 crc kubenswrapper[4845]: I1006 07:00:32.535272 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnfdz\" (UniqueName: \"kubernetes.io/projected/ce3f5dd8-4bb8-4695-bd9b-5363016555f7-kube-api-access-lnfdz\") pod \"ce3f5dd8-4bb8-4695-bd9b-5363016555f7\" (UID: \"ce3f5dd8-4bb8-4695-bd9b-5363016555f7\") " Oct 06 07:00:32 crc kubenswrapper[4845]: I1006 07:00:32.542610 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59462ba4-763d-48f0-8d97-98113c788102-kube-api-access-n4jj9" (OuterVolumeSpecName: "kube-api-access-n4jj9") pod "59462ba4-763d-48f0-8d97-98113c788102" (UID: "59462ba4-763d-48f0-8d97-98113c788102"). InnerVolumeSpecName "kube-api-access-n4jj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:00:32 crc kubenswrapper[4845]: I1006 07:00:32.558021 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce3f5dd8-4bb8-4695-bd9b-5363016555f7-kube-api-access-lnfdz" (OuterVolumeSpecName: "kube-api-access-lnfdz") pod "ce3f5dd8-4bb8-4695-bd9b-5363016555f7" (UID: "ce3f5dd8-4bb8-4695-bd9b-5363016555f7"). InnerVolumeSpecName "kube-api-access-lnfdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:00:32 crc kubenswrapper[4845]: I1006 07:00:32.636318 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4jj9\" (UniqueName: \"kubernetes.io/projected/59462ba4-763d-48f0-8d97-98113c788102-kube-api-access-n4jj9\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:32 crc kubenswrapper[4845]: I1006 07:00:32.636352 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnfdz\" (UniqueName: \"kubernetes.io/projected/ce3f5dd8-4bb8-4695-bd9b-5363016555f7-kube-api-access-lnfdz\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:32 crc kubenswrapper[4845]: I1006 07:00:32.818465 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0f95-account-create-59gks" event={"ID":"ce3f5dd8-4bb8-4695-bd9b-5363016555f7","Type":"ContainerDied","Data":"a846d02e325ed6a78bb0a4948e316bcf9370dfce2712ea1542b2a3a790e817fa"} Oct 06 07:00:32 crc kubenswrapper[4845]: I1006 07:00:32.818503 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a846d02e325ed6a78bb0a4948e316bcf9370dfce2712ea1542b2a3a790e817fa" Oct 06 07:00:32 crc kubenswrapper[4845]: I1006 07:00:32.818551 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0f95-account-create-59gks" Oct 06 07:00:32 crc kubenswrapper[4845]: I1006 07:00:32.842667 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ede02a6f-9a89-4a1d-960d-10490334fbd7","Type":"ContainerStarted","Data":"e08de4b416a72069a9593352a53f509ecdcc0e144d1b12347eb05e96563f4f71"} Oct 06 07:00:32 crc kubenswrapper[4845]: I1006 07:00:32.842721 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ede02a6f-9a89-4a1d-960d-10490334fbd7","Type":"ContainerStarted","Data":"453b2928e6c0f0dc030af117697b4c34f00dd2e68bf86e0dd025f787ef970d5a"} Oct 06 07:00:32 crc kubenswrapper[4845]: I1006 07:00:32.846579 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a34d-account-create-d27s2" Oct 06 07:00:32 crc kubenswrapper[4845]: I1006 07:00:32.846667 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a34d-account-create-d27s2" event={"ID":"59462ba4-763d-48f0-8d97-98113c788102","Type":"ContainerDied","Data":"c9c6efc9c2a5b768c569080c325aeb3544c1693b9d685b87598ff29478abc359"} Oct 06 07:00:32 crc kubenswrapper[4845]: I1006 07:00:32.846763 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9c6efc9c2a5b768c569080c325aeb3544c1693b9d685b87598ff29478abc359" Oct 06 07:00:32 crc kubenswrapper[4845]: I1006 07:00:32.846790 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w6t7s" podUID="d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43" containerName="registry-server" containerID="cri-o://5d3c622e672c91c75953fd9c59fbfe018957cc7952f968fb33e91f09e227656f" gracePeriod=2 Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.195848 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-v4zd6-config-ktrfq" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.373077 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/296fc233-07ef-4acf-a5a8-f0f9e865f583-scripts\") pod \"296fc233-07ef-4acf-a5a8-f0f9e865f583\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.373113 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/296fc233-07ef-4acf-a5a8-f0f9e865f583-var-log-ovn\") pod \"296fc233-07ef-4acf-a5a8-f0f9e865f583\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.373195 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/296fc233-07ef-4acf-a5a8-f0f9e865f583-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "296fc233-07ef-4acf-a5a8-f0f9e865f583" (UID: "296fc233-07ef-4acf-a5a8-f0f9e865f583"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.373143 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/296fc233-07ef-4acf-a5a8-f0f9e865f583-additional-scripts\") pod \"296fc233-07ef-4acf-a5a8-f0f9e865f583\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.374239 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/296fc233-07ef-4acf-a5a8-f0f9e865f583-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "296fc233-07ef-4acf-a5a8-f0f9e865f583" (UID: "296fc233-07ef-4acf-a5a8-f0f9e865f583"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.374253 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/296fc233-07ef-4acf-a5a8-f0f9e865f583-scripts" (OuterVolumeSpecName: "scripts") pod "296fc233-07ef-4acf-a5a8-f0f9e865f583" (UID: "296fc233-07ef-4acf-a5a8-f0f9e865f583"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.374315 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d65q\" (UniqueName: \"kubernetes.io/projected/296fc233-07ef-4acf-a5a8-f0f9e865f583-kube-api-access-6d65q\") pod \"296fc233-07ef-4acf-a5a8-f0f9e865f583\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.374441 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/296fc233-07ef-4acf-a5a8-f0f9e865f583-var-run-ovn\") pod \"296fc233-07ef-4acf-a5a8-f0f9e865f583\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.374541 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/296fc233-07ef-4acf-a5a8-f0f9e865f583-var-run\") pod \"296fc233-07ef-4acf-a5a8-f0f9e865f583\" (UID: \"296fc233-07ef-4acf-a5a8-f0f9e865f583\") " Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.374871 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/296fc233-07ef-4acf-a5a8-f0f9e865f583-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.374884 4845 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/296fc233-07ef-4acf-a5a8-f0f9e865f583-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.374895 4845 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/296fc233-07ef-4acf-a5a8-f0f9e865f583-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.374922 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/296fc233-07ef-4acf-a5a8-f0f9e865f583-var-run" (OuterVolumeSpecName: "var-run") pod "296fc233-07ef-4acf-a5a8-f0f9e865f583" (UID: "296fc233-07ef-4acf-a5a8-f0f9e865f583"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.374909 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/296fc233-07ef-4acf-a5a8-f0f9e865f583-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "296fc233-07ef-4acf-a5a8-f0f9e865f583" (UID: "296fc233-07ef-4acf-a5a8-f0f9e865f583"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.378920 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/296fc233-07ef-4acf-a5a8-f0f9e865f583-kube-api-access-6d65q" (OuterVolumeSpecName: "kube-api-access-6d65q") pod "296fc233-07ef-4acf-a5a8-f0f9e865f583" (UID: "296fc233-07ef-4acf-a5a8-f0f9e865f583"). InnerVolumeSpecName "kube-api-access-6d65q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.476319 4845 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/296fc233-07ef-4acf-a5a8-f0f9e865f583-var-run\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.476366 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d65q\" (UniqueName: \"kubernetes.io/projected/296fc233-07ef-4acf-a5a8-f0f9e865f583-kube-api-access-6d65q\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.476465 4845 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/296fc233-07ef-4acf-a5a8-f0f9e865f583-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.860271 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ede02a6f-9a89-4a1d-960d-10490334fbd7","Type":"ContainerStarted","Data":"1a8b28e77c5ef4f4e9de4412a25bc8bd421e80556934e3c45b69f00b0e14c2ce"} Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.860326 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ede02a6f-9a89-4a1d-960d-10490334fbd7","Type":"ContainerStarted","Data":"b50936a669ff5970a5cd59c742f97fd3283f8bef2da4925ab461aaedeea723d0"} Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.870419 4845 generic.go:334] "Generic (PLEG): container finished" podID="d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43" containerID="5d3c622e672c91c75953fd9c59fbfe018957cc7952f968fb33e91f09e227656f" exitCode=0 Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.870513 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6t7s" 
event={"ID":"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43","Type":"ContainerDied","Data":"5d3c622e672c91c75953fd9c59fbfe018957cc7952f968fb33e91f09e227656f"} Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.873718 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v4zd6-config-ktrfq" event={"ID":"296fc233-07ef-4acf-a5a8-f0f9e865f583","Type":"ContainerDied","Data":"62125d47774487f98296fbabf57d2e18927748ac409afa9032dfc77212856e70"} Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.873760 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62125d47774487f98296fbabf57d2e18927748ac409afa9032dfc77212856e70" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.873772 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v4zd6-config-ktrfq" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.890114 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-v4zd6-config-ktrfq"] Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.898845 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-v4zd6-config-ktrfq"] Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.945078 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-v4zd6-config-wz9m6"] Oct 06 07:00:33 crc kubenswrapper[4845]: E1006 07:00:33.945436 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59462ba4-763d-48f0-8d97-98113c788102" containerName="mariadb-account-create" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.945451 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="59462ba4-763d-48f0-8d97-98113c788102" containerName="mariadb-account-create" Oct 06 07:00:33 crc kubenswrapper[4845]: E1006 07:00:33.945485 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3f5dd8-4bb8-4695-bd9b-5363016555f7" 
containerName="mariadb-account-create" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.945493 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3f5dd8-4bb8-4695-bd9b-5363016555f7" containerName="mariadb-account-create" Oct 06 07:00:33 crc kubenswrapper[4845]: E1006 07:00:33.945503 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="296fc233-07ef-4acf-a5a8-f0f9e865f583" containerName="ovn-config" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.945509 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="296fc233-07ef-4acf-a5a8-f0f9e865f583" containerName="ovn-config" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.945678 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="59462ba4-763d-48f0-8d97-98113c788102" containerName="mariadb-account-create" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.945712 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="296fc233-07ef-4acf-a5a8-f0f9e865f583" containerName="ovn-config" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.945729 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce3f5dd8-4bb8-4695-bd9b-5363016555f7" containerName="mariadb-account-create" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.946357 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.948365 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.971085 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v4zd6-config-wz9m6"] Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.991084 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-var-log-ovn\") pod \"ovn-controller-v4zd6-config-wz9m6\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.991200 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-var-run\") pod \"ovn-controller-v4zd6-config-wz9m6\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.991238 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-var-run-ovn\") pod \"ovn-controller-v4zd6-config-wz9m6\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.991273 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-scripts\") pod \"ovn-controller-v4zd6-config-wz9m6\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") 
" pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.991300 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-additional-scripts\") pod \"ovn-controller-v4zd6-config-wz9m6\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:33 crc kubenswrapper[4845]: I1006 07:00:33.991341 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzg9w\" (UniqueName: \"kubernetes.io/projected/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-kube-api-access-nzg9w\") pod \"ovn-controller-v4zd6-config-wz9m6\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.092772 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-var-log-ovn\") pod \"ovn-controller-v4zd6-config-wz9m6\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.092881 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-var-run\") pod \"ovn-controller-v4zd6-config-wz9m6\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.092915 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-var-run-ovn\") pod \"ovn-controller-v4zd6-config-wz9m6\" (UID: 
\"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.092948 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-scripts\") pod \"ovn-controller-v4zd6-config-wz9m6\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.092974 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-additional-scripts\") pod \"ovn-controller-v4zd6-config-wz9m6\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.093011 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzg9w\" (UniqueName: \"kubernetes.io/projected/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-kube-api-access-nzg9w\") pod \"ovn-controller-v4zd6-config-wz9m6\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.093221 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-var-run-ovn\") pod \"ovn-controller-v4zd6-config-wz9m6\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.093221 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-var-log-ovn\") pod \"ovn-controller-v4zd6-config-wz9m6\" (UID: 
\"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.094544 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-additional-scripts\") pod \"ovn-controller-v4zd6-config-wz9m6\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.095206 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-scripts\") pod \"ovn-controller-v4zd6-config-wz9m6\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.095271 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-var-run\") pod \"ovn-controller-v4zd6-config-wz9m6\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.135240 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzg9w\" (UniqueName: \"kubernetes.io/projected/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-kube-api-access-nzg9w\") pod \"ovn-controller-v4zd6-config-wz9m6\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.236305 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="296fc233-07ef-4acf-a5a8-f0f9e865f583" path="/var/lib/kubelet/pods/296fc233-07ef-4acf-a5a8-f0f9e865f583/volumes" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.270338 4845 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.348621 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-v4zd6" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.470471 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w6t7s" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.512140 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43-utilities\") pod \"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43\" (UID: \"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43\") " Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.512365 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkpr5\" (UniqueName: \"kubernetes.io/projected/d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43-kube-api-access-pkpr5\") pod \"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43\" (UID: \"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43\") " Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.512403 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43-catalog-content\") pod \"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43\" (UID: \"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43\") " Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.513574 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43-utilities" (OuterVolumeSpecName: "utilities") pod "d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43" (UID: "d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.519010 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43-kube-api-access-pkpr5" (OuterVolumeSpecName: "kube-api-access-pkpr5") pod "d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43" (UID: "d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43"). InnerVolumeSpecName "kube-api-access-pkpr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.600098 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v4zd6-config-wz9m6"] Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.613862 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkpr5\" (UniqueName: \"kubernetes.io/projected/d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43-kube-api-access-pkpr5\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.613898 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:34 crc kubenswrapper[4845]: W1006 07:00:34.614185 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podceeb6037_057a_47d1_94ea_44dbcedb3d1b.slice/crio-7356b2c875262f16311860619254f9699777bfc8b387c96c2aaf9b642edaafec WatchSource:0}: Error finding container 7356b2c875262f16311860619254f9699777bfc8b387c96c2aaf9b642edaafec: Status 404 returned error can't find the container with id 7356b2c875262f16311860619254f9699777bfc8b387c96c2aaf9b642edaafec Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.633359 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43" (UID: "d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.715083 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.881810 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v4zd6-config-wz9m6" event={"ID":"ceeb6037-057a-47d1-94ea-44dbcedb3d1b","Type":"ContainerStarted","Data":"7356b2c875262f16311860619254f9699777bfc8b387c96c2aaf9b642edaafec"} Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.886115 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6t7s" event={"ID":"d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43","Type":"ContainerDied","Data":"8753cdc2cb0fc38da025dea23e4e078c05fc8943c52c9224512c491c4083c3b3"} Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.886159 4845 scope.go:117] "RemoveContainer" containerID="5d3c622e672c91c75953fd9c59fbfe018957cc7952f968fb33e91f09e227656f" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.886190 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w6t7s" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.929039 4845 scope.go:117] "RemoveContainer" containerID="756de117249d73a6eaad0c7d787cee29b36ccd13b30f71e3c18d2c40dc1b5a24" Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.936240 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w6t7s"] Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.945483 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w6t7s"] Oct 06 07:00:34 crc kubenswrapper[4845]: I1006 07:00:34.963864 4845 scope.go:117] "RemoveContainer" containerID="83c2386fa20721148c15c3e439117b6c547ee49859468632965751c9e37ed9c6" Oct 06 07:00:35 crc kubenswrapper[4845]: I1006 07:00:35.898666 4845 generic.go:334] "Generic (PLEG): container finished" podID="ceeb6037-057a-47d1-94ea-44dbcedb3d1b" containerID="188c1f227b38688df28380b75a2bfaa3f052af97efb3b86e7ccd34f27a80e993" exitCode=0 Oct 06 07:00:35 crc kubenswrapper[4845]: I1006 07:00:35.898749 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v4zd6-config-wz9m6" event={"ID":"ceeb6037-057a-47d1-94ea-44dbcedb3d1b","Type":"ContainerDied","Data":"188c1f227b38688df28380b75a2bfaa3f052af97efb3b86e7ccd34f27a80e993"} Oct 06 07:00:35 crc kubenswrapper[4845]: I1006 07:00:35.905605 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ede02a6f-9a89-4a1d-960d-10490334fbd7","Type":"ContainerStarted","Data":"66cd89be6a6cad10662a3b129993f5df22a7f52688dc51481223a9747de666d9"} Oct 06 07:00:35 crc kubenswrapper[4845]: I1006 07:00:35.905661 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ede02a6f-9a89-4a1d-960d-10490334fbd7","Type":"ContainerStarted","Data":"4d95bc9016cbb85aea3a65073536a16b00d7d5cb7873ca896a1b3bf1fe136c7b"} Oct 06 07:00:36 crc kubenswrapper[4845]: I1006 
07:00:36.237904 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43" path="/var/lib/kubelet/pods/d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43/volumes" Oct 06 07:00:36 crc kubenswrapper[4845]: I1006 07:00:36.931916 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ede02a6f-9a89-4a1d-960d-10490334fbd7","Type":"ContainerStarted","Data":"c98ac1fe46bdcad53257f14eec5f4c18f5b90924bb42a03cd800e427528af9e7"} Oct 06 07:00:36 crc kubenswrapper[4845]: I1006 07:00:36.931975 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ede02a6f-9a89-4a1d-960d-10490334fbd7","Type":"ContainerStarted","Data":"9af3156fd10c38e4567743585dbba2ad3607d8bff1011246252c5c21e6509411"} Oct 06 07:00:36 crc kubenswrapper[4845]: I1006 07:00:36.931989 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ede02a6f-9a89-4a1d-960d-10490334fbd7","Type":"ContainerStarted","Data":"f005f7d52033e01222078a8104705bf048172f1f264d095443cf7115a57132e6"} Oct 06 07:00:36 crc kubenswrapper[4845]: I1006 07:00:36.932002 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ede02a6f-9a89-4a1d-960d-10490334fbd7","Type":"ContainerStarted","Data":"78fc5e6295ce2d7cf8301d604b4704a6282e9eba849fc996d5385d6243210372"} Oct 06 07:00:36 crc kubenswrapper[4845]: I1006 07:00:36.932041 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ede02a6f-9a89-4a1d-960d-10490334fbd7","Type":"ContainerStarted","Data":"9611ecb509d1caef6532cedbd8458a3bd60838c398b45f995b3d0c7c56a77b7f"} Oct 06 07:00:36 crc kubenswrapper[4845]: I1006 07:00:36.975112 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=19.580982129 podStartE2EDuration="25.975094006s" podCreationTimestamp="2025-10-06 07:00:11 +0000 
UTC" firstStartedPulling="2025-10-06 07:00:29.009516533 +0000 UTC m=+913.524257551" lastFinishedPulling="2025-10-06 07:00:35.40362841 +0000 UTC m=+919.918369428" observedRunningTime="2025-10-06 07:00:36.965666864 +0000 UTC m=+921.480407892" watchObservedRunningTime="2025-10-06 07:00:36.975094006 +0000 UTC m=+921.489835014" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.246238 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-775fbffbc7-qzpjj"] Oct 06 07:00:37 crc kubenswrapper[4845]: E1006 07:00:37.246908 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43" containerName="extract-utilities" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.246924 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43" containerName="extract-utilities" Oct 06 07:00:37 crc kubenswrapper[4845]: E1006 07:00:37.246945 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43" containerName="extract-content" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.246951 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43" containerName="extract-content" Oct 06 07:00:37 crc kubenswrapper[4845]: E1006 07:00:37.246968 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43" containerName="registry-server" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.246974 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43" containerName="registry-server" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.247130 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9a0e64c-9b5c-4c42-9fdc-044ad3d2bf43" containerName="registry-server" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.249291 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.253014 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.262792 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-775fbffbc7-qzpjj"] Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.282143 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.351821 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-ovsdbserver-sb\") pod \"dnsmasq-dns-775fbffbc7-qzpjj\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") " pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.351917 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-dns-swift-storage-0\") pod \"dnsmasq-dns-775fbffbc7-qzpjj\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") " pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.351981 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-ovsdbserver-nb\") pod \"dnsmasq-dns-775fbffbc7-qzpjj\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") " pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.352005 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-config\") pod \"dnsmasq-dns-775fbffbc7-qzpjj\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") " pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.352059 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-dns-svc\") pod \"dnsmasq-dns-775fbffbc7-qzpjj\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") " pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.352080 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frfnk\" (UniqueName: \"kubernetes.io/projected/c4678683-6339-4d33-8297-76bfb9f58bb0-kube-api-access-frfnk\") pod \"dnsmasq-dns-775fbffbc7-qzpjj\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") " pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.452932 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-var-log-ovn\") pod \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.453018 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzg9w\" (UniqueName: \"kubernetes.io/projected/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-kube-api-access-nzg9w\") pod \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.453059 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-var-run-ovn\") pod \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.453074 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ceeb6037-057a-47d1-94ea-44dbcedb3d1b" (UID: "ceeb6037-057a-47d1-94ea-44dbcedb3d1b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.453115 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-scripts\") pod \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.453136 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-additional-scripts\") pod \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.453142 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ceeb6037-057a-47d1-94ea-44dbcedb3d1b" (UID: "ceeb6037-057a-47d1-94ea-44dbcedb3d1b"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.453194 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-var-run\") pod \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\" (UID: \"ceeb6037-057a-47d1-94ea-44dbcedb3d1b\") " Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.453341 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-var-run" (OuterVolumeSpecName: "var-run") pod "ceeb6037-057a-47d1-94ea-44dbcedb3d1b" (UID: "ceeb6037-057a-47d1-94ea-44dbcedb3d1b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.453388 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-ovsdbserver-nb\") pod \"dnsmasq-dns-775fbffbc7-qzpjj\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") " pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.453471 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-config\") pod \"dnsmasq-dns-775fbffbc7-qzpjj\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") " pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.453560 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-dns-svc\") pod \"dnsmasq-dns-775fbffbc7-qzpjj\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") " pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: 
I1006 07:00:37.453603 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frfnk\" (UniqueName: \"kubernetes.io/projected/c4678683-6339-4d33-8297-76bfb9f58bb0-kube-api-access-frfnk\") pod \"dnsmasq-dns-775fbffbc7-qzpjj\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") " pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.453716 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-ovsdbserver-sb\") pod \"dnsmasq-dns-775fbffbc7-qzpjj\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") " pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.453830 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-dns-swift-storage-0\") pod \"dnsmasq-dns-775fbffbc7-qzpjj\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") " pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.453954 4845 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-var-run\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.453964 4845 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.453974 4845 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.454123 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-ovsdbserver-nb\") pod \"dnsmasq-dns-775fbffbc7-qzpjj\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") " pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.454406 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-dns-svc\") pod \"dnsmasq-dns-775fbffbc7-qzpjj\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") " pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.454423 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-config\") pod \"dnsmasq-dns-775fbffbc7-qzpjj\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") " pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.454783 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ceeb6037-057a-47d1-94ea-44dbcedb3d1b" (UID: "ceeb6037-057a-47d1-94ea-44dbcedb3d1b"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.454839 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-ovsdbserver-sb\") pod \"dnsmasq-dns-775fbffbc7-qzpjj\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") " pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.455474 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-scripts" (OuterVolumeSpecName: "scripts") pod "ceeb6037-057a-47d1-94ea-44dbcedb3d1b" (UID: "ceeb6037-057a-47d1-94ea-44dbcedb3d1b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.455842 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-dns-swift-storage-0\") pod \"dnsmasq-dns-775fbffbc7-qzpjj\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") " pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.460901 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-kube-api-access-nzg9w" (OuterVolumeSpecName: "kube-api-access-nzg9w") pod "ceeb6037-057a-47d1-94ea-44dbcedb3d1b" (UID: "ceeb6037-057a-47d1-94ea-44dbcedb3d1b"). InnerVolumeSpecName "kube-api-access-nzg9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.469259 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frfnk\" (UniqueName: \"kubernetes.io/projected/c4678683-6339-4d33-8297-76bfb9f58bb0-kube-api-access-frfnk\") pod \"dnsmasq-dns-775fbffbc7-qzpjj\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") " pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.555492 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzg9w\" (UniqueName: \"kubernetes.io/projected/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-kube-api-access-nzg9w\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.555536 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.555547 4845 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ceeb6037-057a-47d1-94ea-44dbcedb3d1b-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.599132 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.942911 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v4zd6-config-wz9m6" event={"ID":"ceeb6037-057a-47d1-94ea-44dbcedb3d1b","Type":"ContainerDied","Data":"7356b2c875262f16311860619254f9699777bfc8b387c96c2aaf9b642edaafec"} Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.943172 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7356b2c875262f16311860619254f9699777bfc8b387c96c2aaf9b642edaafec" Oct 06 07:00:37 crc kubenswrapper[4845]: I1006 07:00:37.943559 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v4zd6-config-wz9m6" Oct 06 07:00:38 crc kubenswrapper[4845]: I1006 07:00:38.067772 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-775fbffbc7-qzpjj"] Oct 06 07:00:38 crc kubenswrapper[4845]: W1006 07:00:38.074167 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4678683_6339_4d33_8297_76bfb9f58bb0.slice/crio-e23cf7e2eadd910d7d270f9b6f87ff1232bab8879307ba39d171a527cb8c3065 WatchSource:0}: Error finding container e23cf7e2eadd910d7d270f9b6f87ff1232bab8879307ba39d171a527cb8c3065: Status 404 returned error can't find the container with id e23cf7e2eadd910d7d270f9b6f87ff1232bab8879307ba39d171a527cb8c3065 Oct 06 07:00:38 crc kubenswrapper[4845]: E1006 07:00:38.115790 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podceeb6037_057a_47d1_94ea_44dbcedb3d1b.slice\": RecentStats: unable to find data in memory cache]" Oct 06 07:00:38 crc kubenswrapper[4845]: I1006 07:00:38.364586 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ovn-controller-v4zd6-config-wz9m6"] Oct 06 07:00:38 crc kubenswrapper[4845]: I1006 07:00:38.370606 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-v4zd6-config-wz9m6"] Oct 06 07:00:38 crc kubenswrapper[4845]: I1006 07:00:38.958162 4845 generic.go:334] "Generic (PLEG): container finished" podID="38d9a5cf-6de3-487c-a71c-374ca55ca525" containerID="7822a084cc8e1391a4f88232f9ad984b3d121bb9054ea9593f43c5d0942dc39c" exitCode=0 Oct 06 07:00:38 crc kubenswrapper[4845]: I1006 07:00:38.958254 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38d9a5cf-6de3-487c-a71c-374ca55ca525","Type":"ContainerDied","Data":"7822a084cc8e1391a4f88232f9ad984b3d121bb9054ea9593f43c5d0942dc39c"} Oct 06 07:00:38 crc kubenswrapper[4845]: I1006 07:00:38.960962 4845 generic.go:334] "Generic (PLEG): container finished" podID="c4678683-6339-4d33-8297-76bfb9f58bb0" containerID="99599fb217d525d43f6dfb0f189c3ed7e1987529165266beb1e7cafedfaaaf6c" exitCode=0 Oct 06 07:00:38 crc kubenswrapper[4845]: I1006 07:00:38.961018 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" event={"ID":"c4678683-6339-4d33-8297-76bfb9f58bb0","Type":"ContainerDied","Data":"99599fb217d525d43f6dfb0f189c3ed7e1987529165266beb1e7cafedfaaaf6c"} Oct 06 07:00:38 crc kubenswrapper[4845]: I1006 07:00:38.961043 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" event={"ID":"c4678683-6339-4d33-8297-76bfb9f58bb0","Type":"ContainerStarted","Data":"e23cf7e2eadd910d7d270f9b6f87ff1232bab8879307ba39d171a527cb8c3065"} Oct 06 07:00:39 crc kubenswrapper[4845]: I1006 07:00:39.971561 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" event={"ID":"c4678683-6339-4d33-8297-76bfb9f58bb0","Type":"ContainerStarted","Data":"9233cacba130e24e77381799a959ffd54d25950e806347f2c586d4541007849d"} Oct 06 07:00:39 
crc kubenswrapper[4845]: I1006 07:00:39.971943 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:39 crc kubenswrapper[4845]: I1006 07:00:39.974136 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38d9a5cf-6de3-487c-a71c-374ca55ca525","Type":"ContainerStarted","Data":"b03e7f50e171be756612b0954404b099795066c4f9a924a451b446a1704bb5e8"} Oct 06 07:00:39 crc kubenswrapper[4845]: I1006 07:00:39.974319 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:00:39 crc kubenswrapper[4845]: I1006 07:00:39.994859 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" podStartSLOduration=2.994838091 podStartE2EDuration="2.994838091s" podCreationTimestamp="2025-10-06 07:00:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:00:39.989924214 +0000 UTC m=+924.504665242" watchObservedRunningTime="2025-10-06 07:00:39.994838091 +0000 UTC m=+924.509579099" Oct 06 07:00:40 crc kubenswrapper[4845]: I1006 07:00:40.018270 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371960.836527 podStartE2EDuration="1m16.018249143s" podCreationTimestamp="2025-10-06 06:59:24 +0000 UTC" firstStartedPulling="2025-10-06 06:59:26.640974135 +0000 UTC m=+851.155715143" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:00:40.014691441 +0000 UTC m=+924.529432459" watchObservedRunningTime="2025-10-06 07:00:40.018249143 +0000 UTC m=+924.532990151" Oct 06 07:00:40 crc kubenswrapper[4845]: I1006 07:00:40.235571 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceeb6037-057a-47d1-94ea-44dbcedb3d1b" 
path="/var/lib/kubelet/pods/ceeb6037-057a-47d1-94ea-44dbcedb3d1b/volumes" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.092564 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.414423 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-vcfff"] Oct 06 07:00:46 crc kubenswrapper[4845]: E1006 07:00:46.416888 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceeb6037-057a-47d1-94ea-44dbcedb3d1b" containerName="ovn-config" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.416929 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceeb6037-057a-47d1-94ea-44dbcedb3d1b" containerName="ovn-config" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.417105 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceeb6037-057a-47d1-94ea-44dbcedb3d1b" containerName="ovn-config" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.417683 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-vcfff" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.435765 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vcfff"] Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.517979 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvtvf\" (UniqueName: \"kubernetes.io/projected/6d918037-6a73-4a99-9e69-28495f63f3be-kube-api-access-zvtvf\") pod \"barbican-db-create-vcfff\" (UID: \"6d918037-6a73-4a99-9e69-28495f63f3be\") " pod="openstack/barbican-db-create-vcfff" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.623458 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvtvf\" (UniqueName: \"kubernetes.io/projected/6d918037-6a73-4a99-9e69-28495f63f3be-kube-api-access-zvtvf\") pod \"barbican-db-create-vcfff\" (UID: \"6d918037-6a73-4a99-9e69-28495f63f3be\") " pod="openstack/barbican-db-create-vcfff" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.632689 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-vwhfh"] Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.633677 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-vwhfh" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.687322 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvtvf\" (UniqueName: \"kubernetes.io/projected/6d918037-6a73-4a99-9e69-28495f63f3be-kube-api-access-zvtvf\") pod \"barbican-db-create-vcfff\" (UID: \"6d918037-6a73-4a99-9e69-28495f63f3be\") " pod="openstack/barbican-db-create-vcfff" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.693529 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-vwhfh"] Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.728331 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgrbr\" (UniqueName: \"kubernetes.io/projected/c59096b3-228f-421b-9e87-ad0dd90c1ab3-kube-api-access-bgrbr\") pod \"cinder-db-create-vwhfh\" (UID: \"c59096b3-228f-421b-9e87-ad0dd90c1ab3\") " pod="openstack/cinder-db-create-vwhfh" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.742766 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vcfff" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.814816 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-bzr5z"] Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.815913 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-bzr5z" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.829017 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-bzr5z"] Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.830815 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgrbr\" (UniqueName: \"kubernetes.io/projected/c59096b3-228f-421b-9e87-ad0dd90c1ab3-kube-api-access-bgrbr\") pod \"cinder-db-create-vwhfh\" (UID: \"c59096b3-228f-421b-9e87-ad0dd90c1ab3\") " pod="openstack/cinder-db-create-vwhfh" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.859693 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgrbr\" (UniqueName: \"kubernetes.io/projected/c59096b3-228f-421b-9e87-ad0dd90c1ab3-kube-api-access-bgrbr\") pod \"cinder-db-create-vwhfh\" (UID: \"c59096b3-228f-421b-9e87-ad0dd90c1ab3\") " pod="openstack/cinder-db-create-vwhfh" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.886459 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-jnr5f"] Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.890741 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jnr5f" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.894465 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xw6wh" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.894618 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.894782 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.894933 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 07:00:46 crc kubenswrapper[4845]: I1006 07:00:46.898097 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jnr5f"] Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:46.932023 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgtrl\" (UniqueName: \"kubernetes.io/projected/a384bf06-7d5c-4189-88ca-67302613c968-kube-api-access-rgtrl\") pod \"neutron-db-create-bzr5z\" (UID: \"a384bf06-7d5c-4189-88ca-67302613c968\") " pod="openstack/neutron-db-create-bzr5z" Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:46.932086 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wtsk\" (UniqueName: \"kubernetes.io/projected/29c86b89-d988-4e25-8019-77e7e266f785-kube-api-access-5wtsk\") pod \"keystone-db-sync-jnr5f\" (UID: \"29c86b89-d988-4e25-8019-77e7e266f785\") " pod="openstack/keystone-db-sync-jnr5f" Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:46.932169 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c86b89-d988-4e25-8019-77e7e266f785-combined-ca-bundle\") pod 
\"keystone-db-sync-jnr5f\" (UID: \"29c86b89-d988-4e25-8019-77e7e266f785\") " pod="openstack/keystone-db-sync-jnr5f" Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:46.932221 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c86b89-d988-4e25-8019-77e7e266f785-config-data\") pod \"keystone-db-sync-jnr5f\" (UID: \"29c86b89-d988-4e25-8019-77e7e266f785\") " pod="openstack/keystone-db-sync-jnr5f" Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:47.036257 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c86b89-d988-4e25-8019-77e7e266f785-combined-ca-bundle\") pod \"keystone-db-sync-jnr5f\" (UID: \"29c86b89-d988-4e25-8019-77e7e266f785\") " pod="openstack/keystone-db-sync-jnr5f" Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:47.036314 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c86b89-d988-4e25-8019-77e7e266f785-config-data\") pod \"keystone-db-sync-jnr5f\" (UID: \"29c86b89-d988-4e25-8019-77e7e266f785\") " pod="openstack/keystone-db-sync-jnr5f" Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:47.036506 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgtrl\" (UniqueName: \"kubernetes.io/projected/a384bf06-7d5c-4189-88ca-67302613c968-kube-api-access-rgtrl\") pod \"neutron-db-create-bzr5z\" (UID: \"a384bf06-7d5c-4189-88ca-67302613c968\") " pod="openstack/neutron-db-create-bzr5z" Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:47.036578 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wtsk\" (UniqueName: \"kubernetes.io/projected/29c86b89-d988-4e25-8019-77e7e266f785-kube-api-access-5wtsk\") pod \"keystone-db-sync-jnr5f\" (UID: \"29c86b89-d988-4e25-8019-77e7e266f785\") " 
pod="openstack/keystone-db-sync-jnr5f" Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:47.043940 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c86b89-d988-4e25-8019-77e7e266f785-combined-ca-bundle\") pod \"keystone-db-sync-jnr5f\" (UID: \"29c86b89-d988-4e25-8019-77e7e266f785\") " pod="openstack/keystone-db-sync-jnr5f" Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:47.044582 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c86b89-d988-4e25-8019-77e7e266f785-config-data\") pod \"keystone-db-sync-jnr5f\" (UID: \"29c86b89-d988-4e25-8019-77e7e266f785\") " pod="openstack/keystone-db-sync-jnr5f" Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:47.059028 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgtrl\" (UniqueName: \"kubernetes.io/projected/a384bf06-7d5c-4189-88ca-67302613c968-kube-api-access-rgtrl\") pod \"neutron-db-create-bzr5z\" (UID: \"a384bf06-7d5c-4189-88ca-67302613c968\") " pod="openstack/neutron-db-create-bzr5z" Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:47.061142 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wtsk\" (UniqueName: \"kubernetes.io/projected/29c86b89-d988-4e25-8019-77e7e266f785-kube-api-access-5wtsk\") pod \"keystone-db-sync-jnr5f\" (UID: \"29c86b89-d988-4e25-8019-77e7e266f785\") " pod="openstack/keystone-db-sync-jnr5f" Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:47.125513 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-vwhfh" Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:47.140679 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-bzr5z" Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:47.214440 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jnr5f" Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:47.600542 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:47.656802 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c7856c787-p69z9"] Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:47.657035 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c7856c787-p69z9" podUID="4fb1d4cd-3f24-42a8-87d3-888654b8ac01" containerName="dnsmasq-dns" containerID="cri-o://0680f345cdd85c5410198edfad14ddbfaf6ea7adc947019e44d90dad2e7213c0" gracePeriod=10 Oct 06 07:00:47 crc kubenswrapper[4845]: W1006 07:00:47.767923 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d918037_6a73_4a99_9e69_28495f63f3be.slice/crio-c5df72d8430c03ba67171e55189d99ea74179bf8a93c3c646bedba8703a84428 WatchSource:0}: Error finding container c5df72d8430c03ba67171e55189d99ea74179bf8a93c3c646bedba8703a84428: Status 404 returned error can't find the container with id c5df72d8430c03ba67171e55189d99ea74179bf8a93c3c646bedba8703a84428 Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:47.771111 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vcfff"] Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:47.828443 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-vwhfh"] Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:47.832044 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-bzr5z"] Oct 06 07:00:47 crc 
kubenswrapper[4845]: W1006 07:00:47.836179 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc59096b3_228f_421b_9e87_ad0dd90c1ab3.slice/crio-6da54aea60f0bdf4b5d83156ca5c8185aea8ca3ebb19bf2fc47dc9e43213a4c1 WatchSource:0}: Error finding container 6da54aea60f0bdf4b5d83156ca5c8185aea8ca3ebb19bf2fc47dc9e43213a4c1: Status 404 returned error can't find the container with id 6da54aea60f0bdf4b5d83156ca5c8185aea8ca3ebb19bf2fc47dc9e43213a4c1 Oct 06 07:00:47 crc kubenswrapper[4845]: W1006 07:00:47.857681 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda384bf06_7d5c_4189_88ca_67302613c968.slice/crio-ffa4ebd38ea2acc70eae73e026a8deb36df18986c9d68ee1c2e2616be5d87fc5 WatchSource:0}: Error finding container ffa4ebd38ea2acc70eae73e026a8deb36df18986c9d68ee1c2e2616be5d87fc5: Status 404 returned error can't find the container with id ffa4ebd38ea2acc70eae73e026a8deb36df18986c9d68ee1c2e2616be5d87fc5 Oct 06 07:00:47 crc kubenswrapper[4845]: I1006 07:00:47.863898 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jnr5f"] Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.052934 4845 generic.go:334] "Generic (PLEG): container finished" podID="4fb1d4cd-3f24-42a8-87d3-888654b8ac01" containerID="0680f345cdd85c5410198edfad14ddbfaf6ea7adc947019e44d90dad2e7213c0" exitCode=0 Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.053111 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c7856c787-p69z9" event={"ID":"4fb1d4cd-3f24-42a8-87d3-888654b8ac01","Type":"ContainerDied","Data":"0680f345cdd85c5410198edfad14ddbfaf6ea7adc947019e44d90dad2e7213c0"} Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.057520 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vcfff" 
event={"ID":"6d918037-6a73-4a99-9e69-28495f63f3be","Type":"ContainerStarted","Data":"c5df72d8430c03ba67171e55189d99ea74179bf8a93c3c646bedba8703a84428"} Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.059578 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qnzpb" event={"ID":"ef9aaef1-2966-4818-9f26-87bcf275907d","Type":"ContainerStarted","Data":"929e16a909839f0532543c2b025fe4a732387ac9475baeb28d68bf0d835ca67f"} Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.061173 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-bzr5z" event={"ID":"a384bf06-7d5c-4189-88ca-67302613c968","Type":"ContainerStarted","Data":"ffa4ebd38ea2acc70eae73e026a8deb36df18986c9d68ee1c2e2616be5d87fc5"} Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.063467 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vwhfh" event={"ID":"c59096b3-228f-421b-9e87-ad0dd90c1ab3","Type":"ContainerStarted","Data":"6da54aea60f0bdf4b5d83156ca5c8185aea8ca3ebb19bf2fc47dc9e43213a4c1"} Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.069712 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jnr5f" event={"ID":"29c86b89-d988-4e25-8019-77e7e266f785","Type":"ContainerStarted","Data":"72805a4cc6194f1a6f69d8aea3b1e64c46161fe948f4d30733212c9facb65edc"} Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.081605 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-qnzpb" podStartSLOduration=3.192286389 podStartE2EDuration="19.081588148s" podCreationTimestamp="2025-10-06 07:00:29 +0000 UTC" firstStartedPulling="2025-10-06 07:00:30.682658263 +0000 UTC m=+915.197399271" lastFinishedPulling="2025-10-06 07:00:46.571960022 +0000 UTC m=+931.086701030" observedRunningTime="2025-10-06 07:00:48.077266157 +0000 UTC m=+932.592007185" watchObservedRunningTime="2025-10-06 07:00:48.081588148 +0000 UTC 
m=+932.596329156" Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.085215 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c7856c787-p69z9" Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.158660 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-ovsdbserver-sb\") pod \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\" (UID: \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.158704 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-dns-svc\") pod \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\" (UID: \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.158795 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-ovsdbserver-nb\") pod \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\" (UID: \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.158824 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dx4q\" (UniqueName: \"kubernetes.io/projected/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-kube-api-access-9dx4q\") pod \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\" (UID: \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.158869 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-config\") pod \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\" (UID: \"4fb1d4cd-3f24-42a8-87d3-888654b8ac01\") " Oct 06 07:00:48 crc 
kubenswrapper[4845]: I1006 07:00:48.170562 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-kube-api-access-9dx4q" (OuterVolumeSpecName: "kube-api-access-9dx4q") pod "4fb1d4cd-3f24-42a8-87d3-888654b8ac01" (UID: "4fb1d4cd-3f24-42a8-87d3-888654b8ac01"). InnerVolumeSpecName "kube-api-access-9dx4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.211439 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4fb1d4cd-3f24-42a8-87d3-888654b8ac01" (UID: "4fb1d4cd-3f24-42a8-87d3-888654b8ac01"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.220641 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4fb1d4cd-3f24-42a8-87d3-888654b8ac01" (UID: "4fb1d4cd-3f24-42a8-87d3-888654b8ac01"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.226933 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4fb1d4cd-3f24-42a8-87d3-888654b8ac01" (UID: "4fb1d4cd-3f24-42a8-87d3-888654b8ac01"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.235911 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-config" (OuterVolumeSpecName: "config") pod "4fb1d4cd-3f24-42a8-87d3-888654b8ac01" (UID: "4fb1d4cd-3f24-42a8-87d3-888654b8ac01"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.260929 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.260958 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dx4q\" (UniqueName: \"kubernetes.io/projected/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-kube-api-access-9dx4q\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.260971 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-config\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.260979 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:48 crc kubenswrapper[4845]: I1006 07:00:48.260987 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fb1d4cd-3f24-42a8-87d3-888654b8ac01-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:48 crc kubenswrapper[4845]: E1006 07:00:48.400920 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc59096b3_228f_421b_9e87_ad0dd90c1ab3.slice/crio-conmon-0ef35a947d4bfaa4d135fab464751529dde2deaf8189b41ca5b6542a7b750237.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d918037_6a73_4a99_9e69_28495f63f3be.slice/crio-c29c806e8664e06bfdf7e599cb473d52132b1a59dca59532453e6145130f7d3c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc59096b3_228f_421b_9e87_ad0dd90c1ab3.slice/crio-0ef35a947d4bfaa4d135fab464751529dde2deaf8189b41ca5b6542a7b750237.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fb1d4cd_3f24_42a8_87d3_888654b8ac01.slice/crio-45b5820ae10fd18064c5dbbcd87f3ede0f63ff51ddb08af21c20638538be6b94\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d918037_6a73_4a99_9e69_28495f63f3be.slice/crio-conmon-c29c806e8664e06bfdf7e599cb473d52132b1a59dca59532453e6145130f7d3c.scope\": RecentStats: unable to find data in memory cache]" Oct 06 07:00:49 crc kubenswrapper[4845]: I1006 07:00:49.082277 4845 generic.go:334] "Generic (PLEG): container finished" podID="a384bf06-7d5c-4189-88ca-67302613c968" containerID="3d93a939cd1a83b322fefc8f7c6cb78e918d284d9f6c1176b1f4d748d802e7d6" exitCode=0 Oct 06 07:00:49 crc kubenswrapper[4845]: I1006 07:00:49.082639 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-bzr5z" event={"ID":"a384bf06-7d5c-4189-88ca-67302613c968","Type":"ContainerDied","Data":"3d93a939cd1a83b322fefc8f7c6cb78e918d284d9f6c1176b1f4d748d802e7d6"} Oct 06 07:00:49 crc kubenswrapper[4845]: I1006 07:00:49.084050 4845 generic.go:334] "Generic (PLEG): container finished" podID="c59096b3-228f-421b-9e87-ad0dd90c1ab3" 
containerID="0ef35a947d4bfaa4d135fab464751529dde2deaf8189b41ca5b6542a7b750237" exitCode=0 Oct 06 07:00:49 crc kubenswrapper[4845]: I1006 07:00:49.084089 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vwhfh" event={"ID":"c59096b3-228f-421b-9e87-ad0dd90c1ab3","Type":"ContainerDied","Data":"0ef35a947d4bfaa4d135fab464751529dde2deaf8189b41ca5b6542a7b750237"} Oct 06 07:00:49 crc kubenswrapper[4845]: I1006 07:00:49.089584 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c7856c787-p69z9" Oct 06 07:00:49 crc kubenswrapper[4845]: I1006 07:00:49.089604 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c7856c787-p69z9" event={"ID":"4fb1d4cd-3f24-42a8-87d3-888654b8ac01","Type":"ContainerDied","Data":"45b5820ae10fd18064c5dbbcd87f3ede0f63ff51ddb08af21c20638538be6b94"} Oct 06 07:00:49 crc kubenswrapper[4845]: I1006 07:00:49.089706 4845 scope.go:117] "RemoveContainer" containerID="0680f345cdd85c5410198edfad14ddbfaf6ea7adc947019e44d90dad2e7213c0" Oct 06 07:00:49 crc kubenswrapper[4845]: I1006 07:00:49.092121 4845 generic.go:334] "Generic (PLEG): container finished" podID="6d918037-6a73-4a99-9e69-28495f63f3be" containerID="c29c806e8664e06bfdf7e599cb473d52132b1a59dca59532453e6145130f7d3c" exitCode=0 Oct 06 07:00:49 crc kubenswrapper[4845]: I1006 07:00:49.092160 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vcfff" event={"ID":"6d918037-6a73-4a99-9e69-28495f63f3be","Type":"ContainerDied","Data":"c29c806e8664e06bfdf7e599cb473d52132b1a59dca59532453e6145130f7d3c"} Oct 06 07:00:49 crc kubenswrapper[4845]: I1006 07:00:49.123867 4845 scope.go:117] "RemoveContainer" containerID="a3568b386c28d0fea94b185fc942cf040c3a2c15a8bdf5ac3d67e2c39d10cc5e" Oct 06 07:00:49 crc kubenswrapper[4845]: I1006 07:00:49.141958 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c7856c787-p69z9"] Oct 06 07:00:49 
crc kubenswrapper[4845]: I1006 07:00:49.163548 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c7856c787-p69z9"] Oct 06 07:00:50 crc kubenswrapper[4845]: I1006 07:00:50.249822 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fb1d4cd-3f24-42a8-87d3-888654b8ac01" path="/var/lib/kubelet/pods/4fb1d4cd-3f24-42a8-87d3-888654b8ac01/volumes" Oct 06 07:00:50 crc kubenswrapper[4845]: I1006 07:00:50.619098 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vcfff" Oct 06 07:00:50 crc kubenswrapper[4845]: I1006 07:00:50.626141 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-vwhfh" Oct 06 07:00:50 crc kubenswrapper[4845]: I1006 07:00:50.630503 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-bzr5z" Oct 06 07:00:50 crc kubenswrapper[4845]: I1006 07:00:50.727469 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvtvf\" (UniqueName: \"kubernetes.io/projected/6d918037-6a73-4a99-9e69-28495f63f3be-kube-api-access-zvtvf\") pod \"6d918037-6a73-4a99-9e69-28495f63f3be\" (UID: \"6d918037-6a73-4a99-9e69-28495f63f3be\") " Oct 06 07:00:50 crc kubenswrapper[4845]: I1006 07:00:50.727546 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgtrl\" (UniqueName: \"kubernetes.io/projected/a384bf06-7d5c-4189-88ca-67302613c968-kube-api-access-rgtrl\") pod \"a384bf06-7d5c-4189-88ca-67302613c968\" (UID: \"a384bf06-7d5c-4189-88ca-67302613c968\") " Oct 06 07:00:50 crc kubenswrapper[4845]: I1006 07:00:50.727700 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgrbr\" (UniqueName: \"kubernetes.io/projected/c59096b3-228f-421b-9e87-ad0dd90c1ab3-kube-api-access-bgrbr\") pod 
\"c59096b3-228f-421b-9e87-ad0dd90c1ab3\" (UID: \"c59096b3-228f-421b-9e87-ad0dd90c1ab3\") " Oct 06 07:00:50 crc kubenswrapper[4845]: I1006 07:00:50.734466 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a384bf06-7d5c-4189-88ca-67302613c968-kube-api-access-rgtrl" (OuterVolumeSpecName: "kube-api-access-rgtrl") pod "a384bf06-7d5c-4189-88ca-67302613c968" (UID: "a384bf06-7d5c-4189-88ca-67302613c968"). InnerVolumeSpecName "kube-api-access-rgtrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:00:50 crc kubenswrapper[4845]: I1006 07:00:50.734623 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c59096b3-228f-421b-9e87-ad0dd90c1ab3-kube-api-access-bgrbr" (OuterVolumeSpecName: "kube-api-access-bgrbr") pod "c59096b3-228f-421b-9e87-ad0dd90c1ab3" (UID: "c59096b3-228f-421b-9e87-ad0dd90c1ab3"). InnerVolumeSpecName "kube-api-access-bgrbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:00:50 crc kubenswrapper[4845]: I1006 07:00:50.734757 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d918037-6a73-4a99-9e69-28495f63f3be-kube-api-access-zvtvf" (OuterVolumeSpecName: "kube-api-access-zvtvf") pod "6d918037-6a73-4a99-9e69-28495f63f3be" (UID: "6d918037-6a73-4a99-9e69-28495f63f3be"). InnerVolumeSpecName "kube-api-access-zvtvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:00:50 crc kubenswrapper[4845]: I1006 07:00:50.833200 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvtvf\" (UniqueName: \"kubernetes.io/projected/6d918037-6a73-4a99-9e69-28495f63f3be-kube-api-access-zvtvf\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:50 crc kubenswrapper[4845]: I1006 07:00:50.833241 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgtrl\" (UniqueName: \"kubernetes.io/projected/a384bf06-7d5c-4189-88ca-67302613c968-kube-api-access-rgtrl\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:50 crc kubenswrapper[4845]: I1006 07:00:50.833251 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgrbr\" (UniqueName: \"kubernetes.io/projected/c59096b3-228f-421b-9e87-ad0dd90c1ab3-kube-api-access-bgrbr\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:51 crc kubenswrapper[4845]: I1006 07:00:51.116131 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vcfff" event={"ID":"6d918037-6a73-4a99-9e69-28495f63f3be","Type":"ContainerDied","Data":"c5df72d8430c03ba67171e55189d99ea74179bf8a93c3c646bedba8703a84428"} Oct 06 07:00:51 crc kubenswrapper[4845]: I1006 07:00:51.116171 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vcfff" Oct 06 07:00:51 crc kubenswrapper[4845]: I1006 07:00:51.116184 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5df72d8430c03ba67171e55189d99ea74179bf8a93c3c646bedba8703a84428" Oct 06 07:00:51 crc kubenswrapper[4845]: I1006 07:00:51.117695 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-bzr5z" Oct 06 07:00:51 crc kubenswrapper[4845]: I1006 07:00:51.117700 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-bzr5z" event={"ID":"a384bf06-7d5c-4189-88ca-67302613c968","Type":"ContainerDied","Data":"ffa4ebd38ea2acc70eae73e026a8deb36df18986c9d68ee1c2e2616be5d87fc5"} Oct 06 07:00:51 crc kubenswrapper[4845]: I1006 07:00:51.117737 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffa4ebd38ea2acc70eae73e026a8deb36df18986c9d68ee1c2e2616be5d87fc5" Oct 06 07:00:51 crc kubenswrapper[4845]: I1006 07:00:51.119253 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vwhfh" event={"ID":"c59096b3-228f-421b-9e87-ad0dd90c1ab3","Type":"ContainerDied","Data":"6da54aea60f0bdf4b5d83156ca5c8185aea8ca3ebb19bf2fc47dc9e43213a4c1"} Oct 06 07:00:51 crc kubenswrapper[4845]: I1006 07:00:51.119276 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6da54aea60f0bdf4b5d83156ca5c8185aea8ca3ebb19bf2fc47dc9e43213a4c1" Oct 06 07:00:51 crc kubenswrapper[4845]: I1006 07:00:51.119539 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-vwhfh" Oct 06 07:00:53 crc kubenswrapper[4845]: I1006 07:00:53.019318 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 07:00:53 crc kubenswrapper[4845]: I1006 07:00:53.019968 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 07:00:53 crc kubenswrapper[4845]: I1006 07:00:53.020019 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" Oct 06 07:00:53 crc kubenswrapper[4845]: I1006 07:00:53.020959 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f1a1b8d6a136dbd6653eb7b5058c9c79831c66bac509d23e8f8977bad3f0b842"} pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 07:00:53 crc kubenswrapper[4845]: I1006 07:00:53.021050 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" containerID="cri-o://f1a1b8d6a136dbd6653eb7b5058c9c79831c66bac509d23e8f8977bad3f0b842" gracePeriod=600 Oct 06 07:00:54 crc kubenswrapper[4845]: I1006 07:00:54.144604 4845 generic.go:334] "Generic (PLEG): container finished" 
podID="6936952c-09f0-48fd-8832-38c18202ae81" containerID="f1a1b8d6a136dbd6653eb7b5058c9c79831c66bac509d23e8f8977bad3f0b842" exitCode=0 Oct 06 07:00:54 crc kubenswrapper[4845]: I1006 07:00:54.144902 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerDied","Data":"f1a1b8d6a136dbd6653eb7b5058c9c79831c66bac509d23e8f8977bad3f0b842"} Oct 06 07:00:54 crc kubenswrapper[4845]: I1006 07:00:54.144935 4845 scope.go:117] "RemoveContainer" containerID="443ce9eb4600adb1d78ff49056676775ceac0472627215eb275bc31a89a7c3b8" Oct 06 07:00:55 crc kubenswrapper[4845]: I1006 07:00:55.155157 4845 generic.go:334] "Generic (PLEG): container finished" podID="ef9aaef1-2966-4818-9f26-87bcf275907d" containerID="929e16a909839f0532543c2b025fe4a732387ac9475baeb28d68bf0d835ca67f" exitCode=0 Oct 06 07:00:55 crc kubenswrapper[4845]: I1006 07:00:55.155211 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qnzpb" event={"ID":"ef9aaef1-2966-4818-9f26-87bcf275907d","Type":"ContainerDied","Data":"929e16a909839f0532543c2b025fe4a732387ac9475baeb28d68bf0d835ca67f"} Oct 06 07:00:55 crc kubenswrapper[4845]: I1006 07:00:55.736312 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.166262 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerStarted","Data":"6a4030ef2b48fb5db1ac392bd2dae2cd42e97737e449c0b7f5beb300ab99f64c"} Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.167862 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jnr5f" 
event={"ID":"29c86b89-d988-4e25-8019-77e7e266f785","Type":"ContainerStarted","Data":"d6aa494a7c2fda78738b6448afae169d3ca3305dcbc62f44c57a3fbab10089b4"} Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.211016 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-jnr5f" podStartSLOduration=2.46710203 podStartE2EDuration="10.211000114s" podCreationTimestamp="2025-10-06 07:00:46 +0000 UTC" firstStartedPulling="2025-10-06 07:00:47.904265629 +0000 UTC m=+932.419006637" lastFinishedPulling="2025-10-06 07:00:55.648163703 +0000 UTC m=+940.162904721" observedRunningTime="2025-10-06 07:00:56.205342229 +0000 UTC m=+940.720083257" watchObservedRunningTime="2025-10-06 07:00:56.211000114 +0000 UTC m=+940.725741122" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.607478 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qnzpb" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.732146 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef9aaef1-2966-4818-9f26-87bcf275907d-db-sync-config-data\") pod \"ef9aaef1-2966-4818-9f26-87bcf275907d\" (UID: \"ef9aaef1-2966-4818-9f26-87bcf275907d\") " Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.732266 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh6tl\" (UniqueName: \"kubernetes.io/projected/ef9aaef1-2966-4818-9f26-87bcf275907d-kube-api-access-vh6tl\") pod \"ef9aaef1-2966-4818-9f26-87bcf275907d\" (UID: \"ef9aaef1-2966-4818-9f26-87bcf275907d\") " Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.732336 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9aaef1-2966-4818-9f26-87bcf275907d-combined-ca-bundle\") pod \"ef9aaef1-2966-4818-9f26-87bcf275907d\" (UID: 
\"ef9aaef1-2966-4818-9f26-87bcf275907d\") " Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.732437 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9aaef1-2966-4818-9f26-87bcf275907d-config-data\") pod \"ef9aaef1-2966-4818-9f26-87bcf275907d\" (UID: \"ef9aaef1-2966-4818-9f26-87bcf275907d\") " Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.751604 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef9aaef1-2966-4818-9f26-87bcf275907d-kube-api-access-vh6tl" (OuterVolumeSpecName: "kube-api-access-vh6tl") pod "ef9aaef1-2966-4818-9f26-87bcf275907d" (UID: "ef9aaef1-2966-4818-9f26-87bcf275907d"). InnerVolumeSpecName "kube-api-access-vh6tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.792705 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef9aaef1-2966-4818-9f26-87bcf275907d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ef9aaef1-2966-4818-9f26-87bcf275907d" (UID: "ef9aaef1-2966-4818-9f26-87bcf275907d"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.841678 4845 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef9aaef1-2966-4818-9f26-87bcf275907d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.841708 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh6tl\" (UniqueName: \"kubernetes.io/projected/ef9aaef1-2966-4818-9f26-87bcf275907d-kube-api-access-vh6tl\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.845482 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-da3d-account-create-cw7vt"] Oct 06 07:00:56 crc kubenswrapper[4845]: E1006 07:00:56.846037 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a384bf06-7d5c-4189-88ca-67302613c968" containerName="mariadb-database-create" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.846057 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="a384bf06-7d5c-4189-88ca-67302613c968" containerName="mariadb-database-create" Oct 06 07:00:56 crc kubenswrapper[4845]: E1006 07:00:56.846076 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb1d4cd-3f24-42a8-87d3-888654b8ac01" containerName="dnsmasq-dns" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.846084 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb1d4cd-3f24-42a8-87d3-888654b8ac01" containerName="dnsmasq-dns" Oct 06 07:00:56 crc kubenswrapper[4845]: E1006 07:00:56.846102 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c59096b3-228f-421b-9e87-ad0dd90c1ab3" containerName="mariadb-database-create" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.846109 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c59096b3-228f-421b-9e87-ad0dd90c1ab3" containerName="mariadb-database-create" Oct 06 
07:00:56 crc kubenswrapper[4845]: E1006 07:00:56.846125 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d918037-6a73-4a99-9e69-28495f63f3be" containerName="mariadb-database-create" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.846131 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d918037-6a73-4a99-9e69-28495f63f3be" containerName="mariadb-database-create" Oct 06 07:00:56 crc kubenswrapper[4845]: E1006 07:00:56.846142 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef9aaef1-2966-4818-9f26-87bcf275907d" containerName="glance-db-sync" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.846148 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef9aaef1-2966-4818-9f26-87bcf275907d" containerName="glance-db-sync" Oct 06 07:00:56 crc kubenswrapper[4845]: E1006 07:00:56.846157 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb1d4cd-3f24-42a8-87d3-888654b8ac01" containerName="init" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.846163 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb1d4cd-3f24-42a8-87d3-888654b8ac01" containerName="init" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.846623 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c59096b3-228f-421b-9e87-ad0dd90c1ab3" containerName="mariadb-database-create" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.846639 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef9aaef1-2966-4818-9f26-87bcf275907d" containerName="glance-db-sync" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.846649 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="a384bf06-7d5c-4189-88ca-67302613c968" containerName="mariadb-database-create" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.846681 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d918037-6a73-4a99-9e69-28495f63f3be" 
containerName="mariadb-database-create" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.846695 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb1d4cd-3f24-42a8-87d3-888654b8ac01" containerName="dnsmasq-dns" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.847555 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-da3d-account-create-cw7vt" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.855280 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.855523 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef9aaef1-2966-4818-9f26-87bcf275907d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef9aaef1-2966-4818-9f26-87bcf275907d" (UID: "ef9aaef1-2966-4818-9f26-87bcf275907d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.868576 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-da3d-account-create-cw7vt"] Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.878465 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef9aaef1-2966-4818-9f26-87bcf275907d-config-data" (OuterVolumeSpecName: "config-data") pod "ef9aaef1-2966-4818-9f26-87bcf275907d" (UID: "ef9aaef1-2966-4818-9f26-87bcf275907d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.943173 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8xc2\" (UniqueName: \"kubernetes.io/projected/c0b56390-32a4-4314-b26f-9e818676dde8-kube-api-access-s8xc2\") pod \"neutron-da3d-account-create-cw7vt\" (UID: \"c0b56390-32a4-4314-b26f-9e818676dde8\") " pod="openstack/neutron-da3d-account-create-cw7vt" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.943331 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9aaef1-2966-4818-9f26-87bcf275907d-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:56 crc kubenswrapper[4845]: I1006 07:00:56.943349 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9aaef1-2966-4818-9f26-87bcf275907d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.044962 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8xc2\" (UniqueName: \"kubernetes.io/projected/c0b56390-32a4-4314-b26f-9e818676dde8-kube-api-access-s8xc2\") pod \"neutron-da3d-account-create-cw7vt\" (UID: \"c0b56390-32a4-4314-b26f-9e818676dde8\") " pod="openstack/neutron-da3d-account-create-cw7vt" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.061110 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8xc2\" (UniqueName: \"kubernetes.io/projected/c0b56390-32a4-4314-b26f-9e818676dde8-kube-api-access-s8xc2\") pod \"neutron-da3d-account-create-cw7vt\" (UID: \"c0b56390-32a4-4314-b26f-9e818676dde8\") " pod="openstack/neutron-da3d-account-create-cw7vt" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.175495 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qnzpb" 
event={"ID":"ef9aaef1-2966-4818-9f26-87bcf275907d","Type":"ContainerDied","Data":"c2360890984fb8d90bf76db37bd39aa38acd23db76d060c744a05497ba63a061"} Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.175543 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qnzpb" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.175557 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2360890984fb8d90bf76db37bd39aa38acd23db76d060c744a05497ba63a061" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.203023 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-da3d-account-create-cw7vt" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.657853 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-da3d-account-create-cw7vt"] Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.680659 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d6456d757-m878m"] Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.692807 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.695682 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d6456d757-m878m"] Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.758120 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-config\") pod \"dnsmasq-dns-6d6456d757-m878m\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.758175 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-dns-svc\") pod \"dnsmasq-dns-6d6456d757-m878m\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.758218 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-dns-swift-storage-0\") pod \"dnsmasq-dns-6d6456d757-m878m\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.758266 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr4wr\" (UniqueName: \"kubernetes.io/projected/b6c41f21-8adb-4524-9e06-d3591c6984cc-kube-api-access-vr4wr\") pod \"dnsmasq-dns-6d6456d757-m878m\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.758283 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-ovsdbserver-nb\") pod \"dnsmasq-dns-6d6456d757-m878m\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.758305 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-ovsdbserver-sb\") pod \"dnsmasq-dns-6d6456d757-m878m\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.861337 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-config\") pod \"dnsmasq-dns-6d6456d757-m878m\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.860041 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-config\") pod \"dnsmasq-dns-6d6456d757-m878m\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.862121 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-dns-svc\") pod \"dnsmasq-dns-6d6456d757-m878m\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.862150 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-dns-svc\") pod \"dnsmasq-dns-6d6456d757-m878m\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.862226 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-dns-swift-storage-0\") pod \"dnsmasq-dns-6d6456d757-m878m\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.862877 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-dns-swift-storage-0\") pod \"dnsmasq-dns-6d6456d757-m878m\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.862990 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr4wr\" (UniqueName: \"kubernetes.io/projected/b6c41f21-8adb-4524-9e06-d3591c6984cc-kube-api-access-vr4wr\") pod \"dnsmasq-dns-6d6456d757-m878m\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.863014 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-ovsdbserver-nb\") pod \"dnsmasq-dns-6d6456d757-m878m\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.863296 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-ovsdbserver-sb\") pod \"dnsmasq-dns-6d6456d757-m878m\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.863930 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-ovsdbserver-sb\") pod \"dnsmasq-dns-6d6456d757-m878m\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.864057 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-ovsdbserver-nb\") pod \"dnsmasq-dns-6d6456d757-m878m\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:57 crc kubenswrapper[4845]: I1006 07:00:57.882174 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr4wr\" (UniqueName: \"kubernetes.io/projected/b6c41f21-8adb-4524-9e06-d3591c6984cc-kube-api-access-vr4wr\") pod \"dnsmasq-dns-6d6456d757-m878m\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:58 crc kubenswrapper[4845]: I1006 07:00:58.029667 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:00:58 crc kubenswrapper[4845]: I1006 07:00:58.192975 4845 generic.go:334] "Generic (PLEG): container finished" podID="c0b56390-32a4-4314-b26f-9e818676dde8" containerID="2ff3bb2409dda5652b90cbec61798c6873cb910c22f2a41bc90e55601c3df11d" exitCode=0 Oct 06 07:00:58 crc kubenswrapper[4845]: I1006 07:00:58.193031 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-da3d-account-create-cw7vt" event={"ID":"c0b56390-32a4-4314-b26f-9e818676dde8","Type":"ContainerDied","Data":"2ff3bb2409dda5652b90cbec61798c6873cb910c22f2a41bc90e55601c3df11d"} Oct 06 07:00:58 crc kubenswrapper[4845]: I1006 07:00:58.193073 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-da3d-account-create-cw7vt" event={"ID":"c0b56390-32a4-4314-b26f-9e818676dde8","Type":"ContainerStarted","Data":"2da2d554c9a9702185fb04a574e23cc3fed3c8a5463a1642485a00c167508dc7"} Oct 06 07:00:58 crc kubenswrapper[4845]: I1006 07:00:58.253955 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d6456d757-m878m"] Oct 06 07:00:58 crc kubenswrapper[4845]: E1006 07:00:58.612738 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6c41f21_8adb_4524_9e06_d3591c6984cc.slice/crio-conmon-080012ea237e71498d9f671c1c74bfd97239e568b3d8336f9c381044db32a910.scope\": RecentStats: unable to find data in memory cache]" Oct 06 07:00:59 crc kubenswrapper[4845]: I1006 07:00:59.206144 4845 generic.go:334] "Generic (PLEG): container finished" podID="29c86b89-d988-4e25-8019-77e7e266f785" containerID="d6aa494a7c2fda78738b6448afae169d3ca3305dcbc62f44c57a3fbab10089b4" exitCode=0 Oct 06 07:00:59 crc kubenswrapper[4845]: I1006 07:00:59.206237 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jnr5f" 
event={"ID":"29c86b89-d988-4e25-8019-77e7e266f785","Type":"ContainerDied","Data":"d6aa494a7c2fda78738b6448afae169d3ca3305dcbc62f44c57a3fbab10089b4"} Oct 06 07:00:59 crc kubenswrapper[4845]: I1006 07:00:59.208189 4845 generic.go:334] "Generic (PLEG): container finished" podID="b6c41f21-8adb-4524-9e06-d3591c6984cc" containerID="080012ea237e71498d9f671c1c74bfd97239e568b3d8336f9c381044db32a910" exitCode=0 Oct 06 07:00:59 crc kubenswrapper[4845]: I1006 07:00:59.209265 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d6456d757-m878m" event={"ID":"b6c41f21-8adb-4524-9e06-d3591c6984cc","Type":"ContainerDied","Data":"080012ea237e71498d9f671c1c74bfd97239e568b3d8336f9c381044db32a910"} Oct 06 07:00:59 crc kubenswrapper[4845]: I1006 07:00:59.209347 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d6456d757-m878m" event={"ID":"b6c41f21-8adb-4524-9e06-d3591c6984cc","Type":"ContainerStarted","Data":"7dba3765e652326c9077a460b97e1352d1686099ca987c7f15c8f653b02826e2"} Oct 06 07:00:59 crc kubenswrapper[4845]: I1006 07:00:59.701346 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-da3d-account-create-cw7vt" Oct 06 07:00:59 crc kubenswrapper[4845]: I1006 07:00:59.811258 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8xc2\" (UniqueName: \"kubernetes.io/projected/c0b56390-32a4-4314-b26f-9e818676dde8-kube-api-access-s8xc2\") pod \"c0b56390-32a4-4314-b26f-9e818676dde8\" (UID: \"c0b56390-32a4-4314-b26f-9e818676dde8\") " Oct 06 07:00:59 crc kubenswrapper[4845]: I1006 07:00:59.815606 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0b56390-32a4-4314-b26f-9e818676dde8-kube-api-access-s8xc2" (OuterVolumeSpecName: "kube-api-access-s8xc2") pod "c0b56390-32a4-4314-b26f-9e818676dde8" (UID: "c0b56390-32a4-4314-b26f-9e818676dde8"). 
InnerVolumeSpecName "kube-api-access-s8xc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:00:59 crc kubenswrapper[4845]: I1006 07:00:59.913573 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8xc2\" (UniqueName: \"kubernetes.io/projected/c0b56390-32a4-4314-b26f-9e818676dde8-kube-api-access-s8xc2\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:00 crc kubenswrapper[4845]: I1006 07:01:00.222408 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d6456d757-m878m" event={"ID":"b6c41f21-8adb-4524-9e06-d3591c6984cc","Type":"ContainerStarted","Data":"a6e723b9cbbbea61c8ac3f4c45ad562de64ee4f6b676c4ea2d2caa4a5d0fc867"} Oct 06 07:01:00 crc kubenswrapper[4845]: I1006 07:01:00.222856 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:01:00 crc kubenswrapper[4845]: I1006 07:01:00.225618 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-da3d-account-create-cw7vt" Oct 06 07:01:00 crc kubenswrapper[4845]: I1006 07:01:00.225624 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-da3d-account-create-cw7vt" event={"ID":"c0b56390-32a4-4314-b26f-9e818676dde8","Type":"ContainerDied","Data":"2da2d554c9a9702185fb04a574e23cc3fed3c8a5463a1642485a00c167508dc7"} Oct 06 07:01:00 crc kubenswrapper[4845]: I1006 07:01:00.225672 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2da2d554c9a9702185fb04a574e23cc3fed3c8a5463a1642485a00c167508dc7" Oct 06 07:01:00 crc kubenswrapper[4845]: I1006 07:01:00.249766 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d6456d757-m878m" podStartSLOduration=3.249742729 podStartE2EDuration="3.249742729s" podCreationTimestamp="2025-10-06 07:00:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-10-06 07:01:00.248468666 +0000 UTC m=+944.763209674" watchObservedRunningTime="2025-10-06 07:01:00.249742729 +0000 UTC m=+944.764483747" Oct 06 07:01:00 crc kubenswrapper[4845]: I1006 07:01:00.512100 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jnr5f" Oct 06 07:01:00 crc kubenswrapper[4845]: I1006 07:01:00.623165 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c86b89-d988-4e25-8019-77e7e266f785-combined-ca-bundle\") pod \"29c86b89-d988-4e25-8019-77e7e266f785\" (UID: \"29c86b89-d988-4e25-8019-77e7e266f785\") " Oct 06 07:01:00 crc kubenswrapper[4845]: I1006 07:01:00.623518 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c86b89-d988-4e25-8019-77e7e266f785-config-data\") pod \"29c86b89-d988-4e25-8019-77e7e266f785\" (UID: \"29c86b89-d988-4e25-8019-77e7e266f785\") " Oct 06 07:01:00 crc kubenswrapper[4845]: I1006 07:01:00.623586 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wtsk\" (UniqueName: \"kubernetes.io/projected/29c86b89-d988-4e25-8019-77e7e266f785-kube-api-access-5wtsk\") pod \"29c86b89-d988-4e25-8019-77e7e266f785\" (UID: \"29c86b89-d988-4e25-8019-77e7e266f785\") " Oct 06 07:01:00 crc kubenswrapper[4845]: I1006 07:01:00.628074 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29c86b89-d988-4e25-8019-77e7e266f785-kube-api-access-5wtsk" (OuterVolumeSpecName: "kube-api-access-5wtsk") pod "29c86b89-d988-4e25-8019-77e7e266f785" (UID: "29c86b89-d988-4e25-8019-77e7e266f785"). InnerVolumeSpecName "kube-api-access-5wtsk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:01:00 crc kubenswrapper[4845]: I1006 07:01:00.645327 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29c86b89-d988-4e25-8019-77e7e266f785-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29c86b89-d988-4e25-8019-77e7e266f785" (UID: "29c86b89-d988-4e25-8019-77e7e266f785"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:00 crc kubenswrapper[4845]: I1006 07:01:00.662541 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29c86b89-d988-4e25-8019-77e7e266f785-config-data" (OuterVolumeSpecName: "config-data") pod "29c86b89-d988-4e25-8019-77e7e266f785" (UID: "29c86b89-d988-4e25-8019-77e7e266f785"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:00 crc kubenswrapper[4845]: I1006 07:01:00.725781 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wtsk\" (UniqueName: \"kubernetes.io/projected/29c86b89-d988-4e25-8019-77e7e266f785-kube-api-access-5wtsk\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:00 crc kubenswrapper[4845]: I1006 07:01:00.725819 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c86b89-d988-4e25-8019-77e7e266f785-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:00 crc kubenswrapper[4845]: I1006 07:01:00.725838 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c86b89-d988-4e25-8019-77e7e266f785-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.235682 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jnr5f" Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.235677 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jnr5f" event={"ID":"29c86b89-d988-4e25-8019-77e7e266f785","Type":"ContainerDied","Data":"72805a4cc6194f1a6f69d8aea3b1e64c46161fe948f4d30733212c9facb65edc"} Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.235843 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72805a4cc6194f1a6f69d8aea3b1e64c46161fe948f4d30733212c9facb65edc" Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.509056 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-w5zsc"] Oct 06 07:01:01 crc kubenswrapper[4845]: E1006 07:01:01.509444 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0b56390-32a4-4314-b26f-9e818676dde8" containerName="mariadb-account-create" Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.509460 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b56390-32a4-4314-b26f-9e818676dde8" containerName="mariadb-account-create" Oct 06 07:01:01 crc kubenswrapper[4845]: E1006 07:01:01.509479 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29c86b89-d988-4e25-8019-77e7e266f785" containerName="keystone-db-sync" Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.509488 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="29c86b89-d988-4e25-8019-77e7e266f785" containerName="keystone-db-sync" Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.509677 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0b56390-32a4-4314-b26f-9e818676dde8" containerName="mariadb-account-create" Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.509699 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="29c86b89-d988-4e25-8019-77e7e266f785" containerName="keystone-db-sync" Oct 06 07:01:01 crc 
kubenswrapper[4845]: I1006 07:01:01.515542 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w5zsc" Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.520404 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d6456d757-m878m"] Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.523314 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.523687 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.524433 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.524655 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xw6wh" Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.538472 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-w5zsc"] Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.550108 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57b56d5b9f-w6pjw"] Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.551817 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.591588 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57b56d5b9f-w6pjw"]
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.640573 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-config-data\") pod \"keystone-bootstrap-w5zsc\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.640646 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk4xs\" (UniqueName: \"kubernetes.io/projected/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-kube-api-access-pk4xs\") pod \"keystone-bootstrap-w5zsc\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.640675 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-credential-keys\") pod \"keystone-bootstrap-w5zsc\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.640692 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-combined-ca-bundle\") pod \"keystone-bootstrap-w5zsc\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.640737 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-fernet-keys\") pod \"keystone-bootstrap-w5zsc\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.640765 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-scripts\") pod \"keystone-bootstrap-w5zsc\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.723298 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.725523 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.730639 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.730837 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.742714 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-scripts\") pod \"keystone-bootstrap-w5zsc\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.742813 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-dns-swift-storage-0\") pod \"dnsmasq-dns-57b56d5b9f-w6pjw\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.742853 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-config-data\") pod \"keystone-bootstrap-w5zsc\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.742901 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-ovsdbserver-sb\") pod \"dnsmasq-dns-57b56d5b9f-w6pjw\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.742938 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-config\") pod \"dnsmasq-dns-57b56d5b9f-w6pjw\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.742971 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk4xs\" (UniqueName: \"kubernetes.io/projected/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-kube-api-access-pk4xs\") pod \"keystone-bootstrap-w5zsc\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.742999 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-credential-keys\") pod \"keystone-bootstrap-w5zsc\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.743026 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-combined-ca-bundle\") pod \"keystone-bootstrap-w5zsc\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.743056 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-ovsdbserver-nb\") pod \"dnsmasq-dns-57b56d5b9f-w6pjw\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.743086 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-dns-svc\") pod \"dnsmasq-dns-57b56d5b9f-w6pjw\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.743117 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6942\" (UniqueName: \"kubernetes.io/projected/2728015d-b72b-4da7-8de1-ff9afba21b1d-kube-api-access-r6942\") pod \"dnsmasq-dns-57b56d5b9f-w6pjw\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.743162 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-fernet-keys\") pod \"keystone-bootstrap-w5zsc\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.764046 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-scripts\") pod \"keystone-bootstrap-w5zsc\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.764417 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-fernet-keys\") pod \"keystone-bootstrap-w5zsc\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.765092 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-config-data\") pod \"keystone-bootstrap-w5zsc\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.767959 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.777917 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-credential-keys\") pod \"keystone-bootstrap-w5zsc\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.787139 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk4xs\" (UniqueName: \"kubernetes.io/projected/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-kube-api-access-pk4xs\") pod \"keystone-bootstrap-w5zsc\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.793856 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-combined-ca-bundle\") pod \"keystone-bootstrap-w5zsc\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.837971 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.844498 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-ovsdbserver-sb\") pod \"dnsmasq-dns-57b56d5b9f-w6pjw\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.844564 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-config\") pod \"dnsmasq-dns-57b56d5b9f-w6pjw\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.844620 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-scripts\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.844648 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.844694 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-ovsdbserver-nb\") pod \"dnsmasq-dns-57b56d5b9f-w6pjw\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.844716 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-dns-svc\") pod \"dnsmasq-dns-57b56d5b9f-w6pjw\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.844739 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6942\" (UniqueName: \"kubernetes.io/projected/2728015d-b72b-4da7-8de1-ff9afba21b1d-kube-api-access-r6942\") pod \"dnsmasq-dns-57b56d5b9f-w6pjw\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.844773 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01f8105a-29c8-4fef-806a-e1ed266242bf-run-httpd\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.844878 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01f8105a-29c8-4fef-806a-e1ed266242bf-log-httpd\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.844912 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk9qf\" (UniqueName: \"kubernetes.io/projected/01f8105a-29c8-4fef-806a-e1ed266242bf-kube-api-access-gk9qf\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.844938 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.844954 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-config-data\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.844999 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-dns-swift-storage-0\") pod \"dnsmasq-dns-57b56d5b9f-w6pjw\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.845398 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-ovsdbserver-sb\") pod \"dnsmasq-dns-57b56d5b9f-w6pjw\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.845530 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-config\") pod \"dnsmasq-dns-57b56d5b9f-w6pjw\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.846048 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-dns-swift-storage-0\") pod \"dnsmasq-dns-57b56d5b9f-w6pjw\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.846127 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-dns-svc\") pod \"dnsmasq-dns-57b56d5b9f-w6pjw\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.846143 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-ovsdbserver-nb\") pod \"dnsmasq-dns-57b56d5b9f-w6pjw\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.881164 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6942\" (UniqueName: \"kubernetes.io/projected/2728015d-b72b-4da7-8de1-ff9afba21b1d-kube-api-access-r6942\") pod \"dnsmasq-dns-57b56d5b9f-w6pjw\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.947028 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-scripts\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.947092 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.947131 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01f8105a-29c8-4fef-806a-e1ed266242bf-run-httpd\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.947197 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01f8105a-29c8-4fef-806a-e1ed266242bf-log-httpd\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.947214 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk9qf\" (UniqueName: \"kubernetes.io/projected/01f8105a-29c8-4fef-806a-e1ed266242bf-kube-api-access-gk9qf\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.947249 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.947264 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-config-data\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.959890 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01f8105a-29c8-4fef-806a-e1ed266242bf-log-httpd\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.963894 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.964139 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01f8105a-29c8-4fef-806a-e1ed266242bf-run-httpd\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.983311 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-config-data\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:01 crc kubenswrapper[4845]: I1006 07:01:01.987351 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-scripts\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.007295 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.018993 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk9qf\" (UniqueName: \"kubernetes.io/projected/01f8105a-29c8-4fef-806a-e1ed266242bf-kube-api-access-gk9qf\") pod \"ceilometer-0\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " pod="openstack/ceilometer-0"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.042566 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-rdfkv"]
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.043646 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rdfkv"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.045833 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qj9gj"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.046807 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.047245 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.053676 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.064433 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-rdfkv"]
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.077445 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57b56d5b9f-w6pjw"]
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.078234 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.152247 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69b48f67c-jvz42"]
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.152650 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4jjd\" (UniqueName: \"kubernetes.io/projected/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-kube-api-access-b4jjd\") pod \"placement-db-sync-rdfkv\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " pod="openstack/placement-db-sync-rdfkv"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.152712 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-combined-ca-bundle\") pod \"placement-db-sync-rdfkv\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " pod="openstack/placement-db-sync-rdfkv"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.152772 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-scripts\") pod \"placement-db-sync-rdfkv\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " pod="openstack/placement-db-sync-rdfkv"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.152851 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-config-data\") pod \"placement-db-sync-rdfkv\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " pod="openstack/placement-db-sync-rdfkv"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.152899 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-logs\") pod \"placement-db-sync-rdfkv\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " pod="openstack/placement-db-sync-rdfkv"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.158448 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b48f67c-jvz42"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.202015 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-rpmdt"]
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.212781 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-rpmdt"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.215041 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-65vsq"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.217367 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.217671 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.220618 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69b48f67c-jvz42"]
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.245288 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d6456d757-m878m" podUID="b6c41f21-8adb-4524-9e06-d3591c6984cc" containerName="dnsmasq-dns" containerID="cri-o://a6e723b9cbbbea61c8ac3f4c45ad562de64ee4f6b676c4ea2d2caa4a5d0fc867" gracePeriod=10
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.256364 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-ovsdbserver-nb\") pod \"dnsmasq-dns-69b48f67c-jvz42\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") " pod="openstack/dnsmasq-dns-69b48f67c-jvz42"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.256430 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4jjd\" (UniqueName: \"kubernetes.io/projected/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-kube-api-access-b4jjd\") pod \"placement-db-sync-rdfkv\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " pod="openstack/placement-db-sync-rdfkv"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.256468 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-combined-ca-bundle\") pod \"placement-db-sync-rdfkv\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " pod="openstack/placement-db-sync-rdfkv"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.256517 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-scripts\") pod \"placement-db-sync-rdfkv\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " pod="openstack/placement-db-sync-rdfkv"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.256563 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kxnk\" (UniqueName: \"kubernetes.io/projected/3c3fe663-8237-4fba-9801-a75d52661c35-kube-api-access-4kxnk\") pod \"dnsmasq-dns-69b48f67c-jvz42\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") " pod="openstack/dnsmasq-dns-69b48f67c-jvz42"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.256599 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-dns-svc\") pod \"dnsmasq-dns-69b48f67c-jvz42\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") " pod="openstack/dnsmasq-dns-69b48f67c-jvz42"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.256633 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-ovsdbserver-sb\") pod \"dnsmasq-dns-69b48f67c-jvz42\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") " pod="openstack/dnsmasq-dns-69b48f67c-jvz42"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.256656 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-config-data\") pod \"placement-db-sync-rdfkv\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " pod="openstack/placement-db-sync-rdfkv"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.256673 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-dns-swift-storage-0\") pod \"dnsmasq-dns-69b48f67c-jvz42\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") " pod="openstack/dnsmasq-dns-69b48f67c-jvz42"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.256712 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-logs\") pod \"placement-db-sync-rdfkv\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " pod="openstack/placement-db-sync-rdfkv"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.256737 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-config\") pod \"dnsmasq-dns-69b48f67c-jvz42\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") " pod="openstack/dnsmasq-dns-69b48f67c-jvz42"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.261747 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-config-data\") pod \"placement-db-sync-rdfkv\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " pod="openstack/placement-db-sync-rdfkv"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.262020 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-logs\") pod \"placement-db-sync-rdfkv\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " pod="openstack/placement-db-sync-rdfkv"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.262053 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-scripts\") pod \"placement-db-sync-rdfkv\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " pod="openstack/placement-db-sync-rdfkv"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.265462 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-combined-ca-bundle\") pod \"placement-db-sync-rdfkv\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " pod="openstack/placement-db-sync-rdfkv"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.275926 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4jjd\" (UniqueName: \"kubernetes.io/projected/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-kube-api-access-b4jjd\") pod \"placement-db-sync-rdfkv\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " pod="openstack/placement-db-sync-rdfkv"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.276566 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-rpmdt"]
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.357949 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdv25\" (UniqueName: \"kubernetes.io/projected/00ff2f10-a7a1-458b-8a67-78879221e169-kube-api-access-zdv25\") pod \"neutron-db-sync-rpmdt\" (UID: \"00ff2f10-a7a1-458b-8a67-78879221e169\") " pod="openstack/neutron-db-sync-rpmdt"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.358047 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ff2f10-a7a1-458b-8a67-78879221e169-combined-ca-bundle\") pod \"neutron-db-sync-rpmdt\" (UID: \"00ff2f10-a7a1-458b-8a67-78879221e169\") " pod="openstack/neutron-db-sync-rpmdt"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.358097 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-ovsdbserver-nb\") pod \"dnsmasq-dns-69b48f67c-jvz42\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") " pod="openstack/dnsmasq-dns-69b48f67c-jvz42"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.358176 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/00ff2f10-a7a1-458b-8a67-78879221e169-config\") pod \"neutron-db-sync-rpmdt\" (UID: \"00ff2f10-a7a1-458b-8a67-78879221e169\") " pod="openstack/neutron-db-sync-rpmdt"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.358202 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kxnk\" (UniqueName: \"kubernetes.io/projected/3c3fe663-8237-4fba-9801-a75d52661c35-kube-api-access-4kxnk\") pod \"dnsmasq-dns-69b48f67c-jvz42\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") " pod="openstack/dnsmasq-dns-69b48f67c-jvz42"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.358248 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-dns-svc\") pod \"dnsmasq-dns-69b48f67c-jvz42\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") " pod="openstack/dnsmasq-dns-69b48f67c-jvz42"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.358281 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-ovsdbserver-sb\") pod \"dnsmasq-dns-69b48f67c-jvz42\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") " pod="openstack/dnsmasq-dns-69b48f67c-jvz42"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.358301 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-dns-swift-storage-0\") pod \"dnsmasq-dns-69b48f67c-jvz42\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") " pod="openstack/dnsmasq-dns-69b48f67c-jvz42"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.358351 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-config\") pod \"dnsmasq-dns-69b48f67c-jvz42\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") " pod="openstack/dnsmasq-dns-69b48f67c-jvz42"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.359299 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-config\") pod \"dnsmasq-dns-69b48f67c-jvz42\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") " pod="openstack/dnsmasq-dns-69b48f67c-jvz42"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.360473 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-ovsdbserver-nb\") pod \"dnsmasq-dns-69b48f67c-jvz42\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") " pod="openstack/dnsmasq-dns-69b48f67c-jvz42"
Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.361255 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-ovsdbserver-sb\") pod \"dnsmasq-dns-69b48f67c-jvz42\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") " pod="openstack/dnsmasq-dns-69b48f67c-jvz42" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.361972 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-dns-svc\") pod \"dnsmasq-dns-69b48f67c-jvz42\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") " pod="openstack/dnsmasq-dns-69b48f67c-jvz42" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.362547 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-dns-swift-storage-0\") pod \"dnsmasq-dns-69b48f67c-jvz42\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") " pod="openstack/dnsmasq-dns-69b48f67c-jvz42" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.369607 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-rdfkv" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.389345 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kxnk\" (UniqueName: \"kubernetes.io/projected/3c3fe663-8237-4fba-9801-a75d52661c35-kube-api-access-4kxnk\") pod \"dnsmasq-dns-69b48f67c-jvz42\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") " pod="openstack/dnsmasq-dns-69b48f67c-jvz42" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.463305 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdv25\" (UniqueName: \"kubernetes.io/projected/00ff2f10-a7a1-458b-8a67-78879221e169-kube-api-access-zdv25\") pod \"neutron-db-sync-rpmdt\" (UID: \"00ff2f10-a7a1-458b-8a67-78879221e169\") " pod="openstack/neutron-db-sync-rpmdt" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.463411 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ff2f10-a7a1-458b-8a67-78879221e169-combined-ca-bundle\") pod \"neutron-db-sync-rpmdt\" (UID: \"00ff2f10-a7a1-458b-8a67-78879221e169\") " pod="openstack/neutron-db-sync-rpmdt" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.463478 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/00ff2f10-a7a1-458b-8a67-78879221e169-config\") pod \"neutron-db-sync-rpmdt\" (UID: \"00ff2f10-a7a1-458b-8a67-78879221e169\") " pod="openstack/neutron-db-sync-rpmdt" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.468261 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ff2f10-a7a1-458b-8a67-78879221e169-combined-ca-bundle\") pod \"neutron-db-sync-rpmdt\" (UID: \"00ff2f10-a7a1-458b-8a67-78879221e169\") " pod="openstack/neutron-db-sync-rpmdt" Oct 06 07:01:02 crc kubenswrapper[4845]: 
I1006 07:01:02.470301 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/00ff2f10-a7a1-458b-8a67-78879221e169-config\") pod \"neutron-db-sync-rpmdt\" (UID: \"00ff2f10-a7a1-458b-8a67-78879221e169\") " pod="openstack/neutron-db-sync-rpmdt" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.484670 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdv25\" (UniqueName: \"kubernetes.io/projected/00ff2f10-a7a1-458b-8a67-78879221e169-kube-api-access-zdv25\") pod \"neutron-db-sync-rpmdt\" (UID: \"00ff2f10-a7a1-458b-8a67-78879221e169\") " pod="openstack/neutron-db-sync-rpmdt" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.567864 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b48f67c-jvz42" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.575487 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-rpmdt" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.602687 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-w5zsc"] Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.643166 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.646228 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.653121 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.653914 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.654931 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.655319 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kvff7" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.676237 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.749606 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.751097 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.758122 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.758312 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.770760 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.770826 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.770848 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67930554-0731-4033-8f7b-00b365fcde1d-logs\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.770883 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " 
pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.770901 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-config-data\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.770937 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-scripts\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.770968 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67930554-0731-4033-8f7b-00b365fcde1d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.770994 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgb27\" (UniqueName: \"kubernetes.io/projected/67930554-0731-4033-8f7b-00b365fcde1d-kube-api-access-xgb27\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.776700 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.790127 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:01:02 crc 
kubenswrapper[4845]: I1006 07:01:02.800450 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57b56d5b9f-w6pjw"] Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.873877 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.873952 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.873973 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-config-data\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.874042 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-scripts\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.874149 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.874167 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3984290c-98c8-467d-82c3-49fd0007906f-logs\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.874193 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.874237 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67930554-0731-4033-8f7b-00b365fcde1d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.874258 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgb27\" (UniqueName: \"kubernetes.io/projected/67930554-0731-4033-8f7b-00b365fcde1d-kube-api-access-xgb27\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.874275 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khcfw\" (UniqueName: \"kubernetes.io/projected/3984290c-98c8-467d-82c3-49fd0007906f-kube-api-access-khcfw\") pod \"glance-default-internal-api-0\" (UID: 
\"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.874434 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.874439 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3984290c-98c8-467d-82c3-49fd0007906f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.876064 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.876770 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.877088 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.877139 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67930554-0731-4033-8f7b-00b365fcde1d-logs\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.877166 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.877963 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67930554-0731-4033-8f7b-00b365fcde1d-logs\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: W1006 07:01:02.893554 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01f8105a_29c8_4fef_806a_e1ed266242bf.slice/crio-cea49f75029dbf1db75154735fd9c6cecbd49c5ad37eaf733c727c2552dd75fc WatchSource:0}: Error finding container cea49f75029dbf1db75154735fd9c6cecbd49c5ad37eaf733c727c2552dd75fc: Status 404 returned error can't find the container with id cea49f75029dbf1db75154735fd9c6cecbd49c5ad37eaf733c727c2552dd75fc Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.893856 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/67930554-0731-4033-8f7b-00b365fcde1d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.914005 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-config-data\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.920672 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.926514 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgb27\" (UniqueName: \"kubernetes.io/projected/67930554-0731-4033-8f7b-00b365fcde1d-kube-api-access-xgb27\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.926900 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-scripts\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.926971 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: 
\"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.942344 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.981180 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.981851 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.981927 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.982016 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.982037 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/3984290c-98c8-467d-82c3-49fd0007906f-logs\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.982066 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.982101 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khcfw\" (UniqueName: \"kubernetes.io/projected/3984290c-98c8-467d-82c3-49fd0007906f-kube-api-access-khcfw\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.982275 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3984290c-98c8-467d-82c3-49fd0007906f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.982328 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.983609 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3984290c-98c8-467d-82c3-49fd0007906f-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.983779 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.987046 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3984290c-98c8-467d-82c3-49fd0007906f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.992022 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.992177 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.993656 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-rpmdt"] Oct 06 07:01:02 crc kubenswrapper[4845]: I1006 07:01:02.998156 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.006873 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.035264 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khcfw\" (UniqueName: \"kubernetes.io/projected/3984290c-98c8-467d-82c3-49fd0007906f-kube-api-access-khcfw\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.041415 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.085660 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-rdfkv"] Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.266398 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.280858 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rdfkv" event={"ID":"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe","Type":"ContainerStarted","Data":"db4fae6f2bce933fbc73ccbd9b427057439d44f19b45cd52b77b4259f07820ba"} Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.281832 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.291896 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rpmdt" event={"ID":"00ff2f10-a7a1-458b-8a67-78879221e169","Type":"ContainerStarted","Data":"39ceb27c5bf83a80ffbd50f6782e4943ab7718f0ca44f87c6bc97265531dace2"} Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.294583 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw" event={"ID":"2728015d-b72b-4da7-8de1-ff9afba21b1d","Type":"ContainerStarted","Data":"af61a85fd6483086629fdf06e8a575cf767530ca43ace497174f33a09451b908"} Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.303466 4845 generic.go:334] "Generic (PLEG): container finished" podID="b6c41f21-8adb-4524-9e06-d3591c6984cc" containerID="a6e723b9cbbbea61c8ac3f4c45ad562de64ee4f6b676c4ea2d2caa4a5d0fc867" exitCode=0 Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.303564 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d6456d757-m878m" event={"ID":"b6c41f21-8adb-4524-9e06-d3591c6984cc","Type":"ContainerDied","Data":"a6e723b9cbbbea61c8ac3f4c45ad562de64ee4f6b676c4ea2d2caa4a5d0fc867"} Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.303600 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d6456d757-m878m" 
event={"ID":"b6c41f21-8adb-4524-9e06-d3591c6984cc","Type":"ContainerDied","Data":"7dba3765e652326c9077a460b97e1352d1686099ca987c7f15c8f653b02826e2"} Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.303621 4845 scope.go:117] "RemoveContainer" containerID="a6e723b9cbbbea61c8ac3f4c45ad562de64ee4f6b676c4ea2d2caa4a5d0fc867" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.303819 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d6456d757-m878m" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.315907 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01f8105a-29c8-4fef-806a-e1ed266242bf","Type":"ContainerStarted","Data":"cea49f75029dbf1db75154735fd9c6cecbd49c5ad37eaf733c727c2552dd75fc"} Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.356470 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w5zsc" event={"ID":"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d","Type":"ContainerStarted","Data":"5d4b72908be1f108e06991a43da38797bc2bb78ee5327f9fc9c480b687f27f8b"} Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.363015 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69b48f67c-jvz42"] Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.406010 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-dns-svc\") pod \"b6c41f21-8adb-4524-9e06-d3591c6984cc\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.406079 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-ovsdbserver-sb\") pod \"b6c41f21-8adb-4524-9e06-d3591c6984cc\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " Oct 06 
07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.406123 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-config\") pod \"b6c41f21-8adb-4524-9e06-d3591c6984cc\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.406201 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr4wr\" (UniqueName: \"kubernetes.io/projected/b6c41f21-8adb-4524-9e06-d3591c6984cc-kube-api-access-vr4wr\") pod \"b6c41f21-8adb-4524-9e06-d3591c6984cc\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.406280 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-dns-swift-storage-0\") pod \"b6c41f21-8adb-4524-9e06-d3591c6984cc\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.406441 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-ovsdbserver-nb\") pod \"b6c41f21-8adb-4524-9e06-d3591c6984cc\" (UID: \"b6c41f21-8adb-4524-9e06-d3591c6984cc\") " Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.415123 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6c41f21-8adb-4524-9e06-d3591c6984cc-kube-api-access-vr4wr" (OuterVolumeSpecName: "kube-api-access-vr4wr") pod "b6c41f21-8adb-4524-9e06-d3591c6984cc" (UID: "b6c41f21-8adb-4524-9e06-d3591c6984cc"). InnerVolumeSpecName "kube-api-access-vr4wr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:01:03 crc kubenswrapper[4845]: W1006 07:01:03.432785 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c3fe663_8237_4fba_9801_a75d52661c35.slice/crio-25a9f9a9ef776d3bbc3e2d9fb1569aa65968f28bef9fb05de2fd31b6290506be WatchSource:0}: Error finding container 25a9f9a9ef776d3bbc3e2d9fb1569aa65968f28bef9fb05de2fd31b6290506be: Status 404 returned error can't find the container with id 25a9f9a9ef776d3bbc3e2d9fb1569aa65968f28bef9fb05de2fd31b6290506be Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.433003 4845 scope.go:117] "RemoveContainer" containerID="080012ea237e71498d9f671c1c74bfd97239e568b3d8336f9c381044db32a910" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.483592 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.489434 4845 scope.go:117] "RemoveContainer" containerID="a6e723b9cbbbea61c8ac3f4c45ad562de64ee4f6b676c4ea2d2caa4a5d0fc867" Oct 06 07:01:03 crc kubenswrapper[4845]: E1006 07:01:03.493029 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6e723b9cbbbea61c8ac3f4c45ad562de64ee4f6b676c4ea2d2caa4a5d0fc867\": container with ID starting with a6e723b9cbbbea61c8ac3f4c45ad562de64ee4f6b676c4ea2d2caa4a5d0fc867 not found: ID does not exist" containerID="a6e723b9cbbbea61c8ac3f4c45ad562de64ee4f6b676c4ea2d2caa4a5d0fc867" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.493058 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6e723b9cbbbea61c8ac3f4c45ad562de64ee4f6b676c4ea2d2caa4a5d0fc867"} err="failed to get container status \"a6e723b9cbbbea61c8ac3f4c45ad562de64ee4f6b676c4ea2d2caa4a5d0fc867\": rpc error: code = NotFound desc = could not find container 
\"a6e723b9cbbbea61c8ac3f4c45ad562de64ee4f6b676c4ea2d2caa4a5d0fc867\": container with ID starting with a6e723b9cbbbea61c8ac3f4c45ad562de64ee4f6b676c4ea2d2caa4a5d0fc867 not found: ID does not exist" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.493084 4845 scope.go:117] "RemoveContainer" containerID="080012ea237e71498d9f671c1c74bfd97239e568b3d8336f9c381044db32a910" Oct 06 07:01:03 crc kubenswrapper[4845]: E1006 07:01:03.493582 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"080012ea237e71498d9f671c1c74bfd97239e568b3d8336f9c381044db32a910\": container with ID starting with 080012ea237e71498d9f671c1c74bfd97239e568b3d8336f9c381044db32a910 not found: ID does not exist" containerID="080012ea237e71498d9f671c1c74bfd97239e568b3d8336f9c381044db32a910" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.493604 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"080012ea237e71498d9f671c1c74bfd97239e568b3d8336f9c381044db32a910"} err="failed to get container status \"080012ea237e71498d9f671c1c74bfd97239e568b3d8336f9c381044db32a910\": rpc error: code = NotFound desc = could not find container \"080012ea237e71498d9f671c1c74bfd97239e568b3d8336f9c381044db32a910\": container with ID starting with 080012ea237e71498d9f671c1c74bfd97239e568b3d8336f9c381044db32a910 not found: ID does not exist" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.494390 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b6c41f21-8adb-4524-9e06-d3591c6984cc" (UID: "b6c41f21-8adb-4524-9e06-d3591c6984cc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.503787 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b6c41f21-8adb-4524-9e06-d3591c6984cc" (UID: "b6c41f21-8adb-4524-9e06-d3591c6984cc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.509801 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-config" (OuterVolumeSpecName: "config") pod "b6c41f21-8adb-4524-9e06-d3591c6984cc" (UID: "b6c41f21-8adb-4524-9e06-d3591c6984cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.510883 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.510901 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-config\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.510913 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr4wr\" (UniqueName: \"kubernetes.io/projected/b6c41f21-8adb-4524-9e06-d3591c6984cc-kube-api-access-vr4wr\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.510923 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 
07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.519182 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b6c41f21-8adb-4524-9e06-d3591c6984cc" (UID: "b6c41f21-8adb-4524-9e06-d3591c6984cc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.571308 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6c41f21-8adb-4524-9e06-d3591c6984cc" (UID: "b6c41f21-8adb-4524-9e06-d3591c6984cc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.613391 4845 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.613427 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6c41f21-8adb-4524-9e06-d3591c6984cc-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.647330 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d6456d757-m878m"] Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.660984 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d6456d757-m878m"] Oct 06 07:01:03 crc kubenswrapper[4845]: I1006 07:01:03.977424 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.035541 4845 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.048341 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.098821 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.264627 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6c41f21-8adb-4524-9e06-d3591c6984cc" path="/var/lib/kubelet/pods/b6c41f21-8adb-4524-9e06-d3591c6984cc/volumes" Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.383519 4845 generic.go:334] "Generic (PLEG): container finished" podID="3c3fe663-8237-4fba-9801-a75d52661c35" containerID="c96ff7b3d53d39f363c7f6abce78f9345f1320b4288a580615ea1bf6635864af" exitCode=0 Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.383871 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b48f67c-jvz42" event={"ID":"3c3fe663-8237-4fba-9801-a75d52661c35","Type":"ContainerDied","Data":"c96ff7b3d53d39f363c7f6abce78f9345f1320b4288a580615ea1bf6635864af"} Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.383899 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b48f67c-jvz42" event={"ID":"3c3fe663-8237-4fba-9801-a75d52661c35","Type":"ContainerStarted","Data":"25a9f9a9ef776d3bbc3e2d9fb1569aa65968f28bef9fb05de2fd31b6290506be"} Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.396682 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w5zsc" event={"ID":"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d","Type":"ContainerStarted","Data":"6ca31b3bb3eca616eed65099486e1d33a2363ee9d6718c2929707499cb202e23"} Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.400385 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rpmdt" 
event={"ID":"00ff2f10-a7a1-458b-8a67-78879221e169","Type":"ContainerStarted","Data":"58caf3b6149fbdb8593cc52903e01819229bfcd094c10597ff05d55cb3e14416"} Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.406018 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"67930554-0731-4033-8f7b-00b365fcde1d","Type":"ContainerStarted","Data":"5f4f1c05f8fc754916d63e48765c126e83b2361cb02fd39fa811154b8b6d9402"} Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.422650 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-rpmdt" podStartSLOduration=2.422630634 podStartE2EDuration="2.422630634s" podCreationTimestamp="2025-10-06 07:01:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:01:04.422364397 +0000 UTC m=+948.937105405" watchObservedRunningTime="2025-10-06 07:01:04.422630634 +0000 UTC m=+948.937371642" Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.425477 4845 generic.go:334] "Generic (PLEG): container finished" podID="2728015d-b72b-4da7-8de1-ff9afba21b1d" containerID="233f35210575ac0cc6cdd9c91d87abd3b23373c6863e8098cca0d5c698d6a623" exitCode=0 Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.425690 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw" event={"ID":"2728015d-b72b-4da7-8de1-ff9afba21b1d","Type":"ContainerDied","Data":"233f35210575ac0cc6cdd9c91d87abd3b23373c6863e8098cca0d5c698d6a623"} Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.430305 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3984290c-98c8-467d-82c3-49fd0007906f","Type":"ContainerStarted","Data":"c5bdaeff61ba1dbb6f01f52e36810cc0e67f47eb59ffdd1567b48ee1f4ef1ab6"} Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.461802 4845 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-w5zsc" podStartSLOduration=3.46177335 podStartE2EDuration="3.46177335s" podCreationTimestamp="2025-10-06 07:01:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:01:04.435936166 +0000 UTC m=+948.950677174" watchObservedRunningTime="2025-10-06 07:01:04.46177335 +0000 UTC m=+948.976514358" Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.866263 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw" Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.955134 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-dns-svc\") pod \"2728015d-b72b-4da7-8de1-ff9afba21b1d\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.955208 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-config\") pod \"2728015d-b72b-4da7-8de1-ff9afba21b1d\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.955252 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-dns-swift-storage-0\") pod \"2728015d-b72b-4da7-8de1-ff9afba21b1d\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.955424 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-ovsdbserver-nb\") pod 
\"2728015d-b72b-4da7-8de1-ff9afba21b1d\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.955486 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6942\" (UniqueName: \"kubernetes.io/projected/2728015d-b72b-4da7-8de1-ff9afba21b1d-kube-api-access-r6942\") pod \"2728015d-b72b-4da7-8de1-ff9afba21b1d\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.955524 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-ovsdbserver-sb\") pod \"2728015d-b72b-4da7-8de1-ff9afba21b1d\" (UID: \"2728015d-b72b-4da7-8de1-ff9afba21b1d\") " Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.982210 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2728015d-b72b-4da7-8de1-ff9afba21b1d" (UID: "2728015d-b72b-4da7-8de1-ff9afba21b1d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.989885 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2728015d-b72b-4da7-8de1-ff9afba21b1d-kube-api-access-r6942" (OuterVolumeSpecName: "kube-api-access-r6942") pod "2728015d-b72b-4da7-8de1-ff9afba21b1d" (UID: "2728015d-b72b-4da7-8de1-ff9afba21b1d"). InnerVolumeSpecName "kube-api-access-r6942". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.992233 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2728015d-b72b-4da7-8de1-ff9afba21b1d" (UID: "2728015d-b72b-4da7-8de1-ff9afba21b1d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:01:04 crc kubenswrapper[4845]: I1006 07:01:04.995515 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2728015d-b72b-4da7-8de1-ff9afba21b1d" (UID: "2728015d-b72b-4da7-8de1-ff9afba21b1d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.008795 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-config" (OuterVolumeSpecName: "config") pod "2728015d-b72b-4da7-8de1-ff9afba21b1d" (UID: "2728015d-b72b-4da7-8de1-ff9afba21b1d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.061314 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.061356 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-config\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.061366 4845 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.061409 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6942\" (UniqueName: \"kubernetes.io/projected/2728015d-b72b-4da7-8de1-ff9afba21b1d-kube-api-access-r6942\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.061421 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.063755 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2728015d-b72b-4da7-8de1-ff9afba21b1d" (UID: "2728015d-b72b-4da7-8de1-ff9afba21b1d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.163459 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2728015d-b72b-4da7-8de1-ff9afba21b1d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.443778 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw" Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.448440 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b56d5b9f-w6pjw" event={"ID":"2728015d-b72b-4da7-8de1-ff9afba21b1d","Type":"ContainerDied","Data":"af61a85fd6483086629fdf06e8a575cf767530ca43ace497174f33a09451b908"} Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.448516 4845 scope.go:117] "RemoveContainer" containerID="233f35210575ac0cc6cdd9c91d87abd3b23373c6863e8098cca0d5c698d6a623" Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.461060 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"67930554-0731-4033-8f7b-00b365fcde1d","Type":"ContainerStarted","Data":"5557add4845744002f0054b8ffc081bbc796a4ca3622fcb2a9b70d3f720e12dc"} Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.461110 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"67930554-0731-4033-8f7b-00b365fcde1d","Type":"ContainerStarted","Data":"70c7c439c3df5202386f5cfcb9c8a822da9f1324f889cb3fd81243c8c35ad0bd"} Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.461273 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="67930554-0731-4033-8f7b-00b365fcde1d" containerName="glance-log" containerID="cri-o://70c7c439c3df5202386f5cfcb9c8a822da9f1324f889cb3fd81243c8c35ad0bd" gracePeriod=30 Oct 
06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.461688 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="67930554-0731-4033-8f7b-00b365fcde1d" containerName="glance-httpd" containerID="cri-o://5557add4845744002f0054b8ffc081bbc796a4ca3622fcb2a9b70d3f720e12dc" gracePeriod=30 Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.489043 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3984290c-98c8-467d-82c3-49fd0007906f","Type":"ContainerStarted","Data":"5cb89ab3617226c212603dd26b74b8260033e0f338de3b14e483969ac84603e8"} Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.495606 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b48f67c-jvz42" event={"ID":"3c3fe663-8237-4fba-9801-a75d52661c35","Type":"ContainerStarted","Data":"72beee165d533fbf20361b0a48c794946dc11415ccd65537901e2ffd2bde35cc"} Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.495648 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69b48f67c-jvz42" Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.502955 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57b56d5b9f-w6pjw"] Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.537384 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57b56d5b9f-w6pjw"] Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.541442 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.541421361 podStartE2EDuration="4.541421361s" podCreationTimestamp="2025-10-06 07:01:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:01:05.522980567 +0000 UTC m=+950.037721575" 
watchObservedRunningTime="2025-10-06 07:01:05.541421361 +0000 UTC m=+950.056162369" Oct 06 07:01:05 crc kubenswrapper[4845]: I1006 07:01:05.566726 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69b48f67c-jvz42" podStartSLOduration=3.566700371 podStartE2EDuration="3.566700371s" podCreationTimestamp="2025-10-06 07:01:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:01:05.543074013 +0000 UTC m=+950.057815021" watchObservedRunningTime="2025-10-06 07:01:05.566700371 +0000 UTC m=+950.081441379" Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.073641 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.192881 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgb27\" (UniqueName: \"kubernetes.io/projected/67930554-0731-4033-8f7b-00b365fcde1d-kube-api-access-xgb27\") pod \"67930554-0731-4033-8f7b-00b365fcde1d\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.192977 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-combined-ca-bundle\") pod \"67930554-0731-4033-8f7b-00b365fcde1d\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.193071 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-scripts\") pod \"67930554-0731-4033-8f7b-00b365fcde1d\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.193089 4845 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"67930554-0731-4033-8f7b-00b365fcde1d\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.193140 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-public-tls-certs\") pod \"67930554-0731-4033-8f7b-00b365fcde1d\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.193179 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-config-data\") pod \"67930554-0731-4033-8f7b-00b365fcde1d\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.193237 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67930554-0731-4033-8f7b-00b365fcde1d-logs\") pod \"67930554-0731-4033-8f7b-00b365fcde1d\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.193280 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67930554-0731-4033-8f7b-00b365fcde1d-httpd-run\") pod \"67930554-0731-4033-8f7b-00b365fcde1d\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") " Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.194083 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67930554-0731-4033-8f7b-00b365fcde1d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "67930554-0731-4033-8f7b-00b365fcde1d" (UID: "67930554-0731-4033-8f7b-00b365fcde1d"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.194312 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67930554-0731-4033-8f7b-00b365fcde1d-logs" (OuterVolumeSpecName: "logs") pod "67930554-0731-4033-8f7b-00b365fcde1d" (UID: "67930554-0731-4033-8f7b-00b365fcde1d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.200053 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-scripts" (OuterVolumeSpecName: "scripts") pod "67930554-0731-4033-8f7b-00b365fcde1d" (UID: "67930554-0731-4033-8f7b-00b365fcde1d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.200909 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67930554-0731-4033-8f7b-00b365fcde1d-kube-api-access-xgb27" (OuterVolumeSpecName: "kube-api-access-xgb27") pod "67930554-0731-4033-8f7b-00b365fcde1d" (UID: "67930554-0731-4033-8f7b-00b365fcde1d"). InnerVolumeSpecName "kube-api-access-xgb27". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.224602 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "67930554-0731-4033-8f7b-00b365fcde1d" (UID: "67930554-0731-4033-8f7b-00b365fcde1d"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.262593 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2728015d-b72b-4da7-8de1-ff9afba21b1d" path="/var/lib/kubelet/pods/2728015d-b72b-4da7-8de1-ff9afba21b1d/volumes"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.266549 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67930554-0731-4033-8f7b-00b365fcde1d" (UID: "67930554-0731-4033-8f7b-00b365fcde1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:01:06 crc kubenswrapper[4845]: E1006 07:01:06.283829 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-config-data podName:67930554-0731-4033-8f7b-00b365fcde1d nodeName:}" failed. No retries permitted until 2025-10-06 07:01:06.783802009 +0000 UTC m=+951.298543017 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-config-data") pod "67930554-0731-4033-8f7b-00b365fcde1d" (UID: "67930554-0731-4033-8f7b-00b365fcde1d") : error deleting /var/lib/kubelet/pods/67930554-0731-4033-8f7b-00b365fcde1d/volume-subpaths: remove /var/lib/kubelet/pods/67930554-0731-4033-8f7b-00b365fcde1d/volume-subpaths: no such file or directory
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.290345 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "67930554-0731-4033-8f7b-00b365fcde1d" (UID: "67930554-0731-4033-8f7b-00b365fcde1d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.296820 4845 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.296848 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67930554-0731-4033-8f7b-00b365fcde1d-logs\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.296857 4845 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67930554-0731-4033-8f7b-00b365fcde1d-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.296866 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgb27\" (UniqueName: \"kubernetes.io/projected/67930554-0731-4033-8f7b-00b365fcde1d-kube-api-access-xgb27\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.296876 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.296885 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.296902 4845 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.327001 4845 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.398718 4845 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.491909 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-ec2e-account-create-sxrxb"]
Oct 06 07:01:06 crc kubenswrapper[4845]: E1006 07:01:06.492332 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c41f21-8adb-4524-9e06-d3591c6984cc" containerName="init"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.492348 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c41f21-8adb-4524-9e06-d3591c6984cc" containerName="init"
Oct 06 07:01:06 crc kubenswrapper[4845]: E1006 07:01:06.492367 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67930554-0731-4033-8f7b-00b365fcde1d" containerName="glance-httpd"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.492392 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="67930554-0731-4033-8f7b-00b365fcde1d" containerName="glance-httpd"
Oct 06 07:01:06 crc kubenswrapper[4845]: E1006 07:01:06.492407 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67930554-0731-4033-8f7b-00b365fcde1d" containerName="glance-log"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.492422 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="67930554-0731-4033-8f7b-00b365fcde1d" containerName="glance-log"
Oct 06 07:01:06 crc kubenswrapper[4845]: E1006 07:01:06.492437 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2728015d-b72b-4da7-8de1-ff9afba21b1d" containerName="init"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.492444 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2728015d-b72b-4da7-8de1-ff9afba21b1d" containerName="init"
Oct 06 07:01:06 crc kubenswrapper[4845]: E1006 07:01:06.492467 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c41f21-8adb-4524-9e06-d3591c6984cc" containerName="dnsmasq-dns"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.492475 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c41f21-8adb-4524-9e06-d3591c6984cc" containerName="dnsmasq-dns"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.492672 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="67930554-0731-4033-8f7b-00b365fcde1d" containerName="glance-log"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.492689 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6c41f21-8adb-4524-9e06-d3591c6984cc" containerName="dnsmasq-dns"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.492747 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="2728015d-b72b-4da7-8de1-ff9afba21b1d" containerName="init"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.492765 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="67930554-0731-4033-8f7b-00b365fcde1d" containerName="glance-httpd"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.493353 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ec2e-account-create-sxrxb"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.495407 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.501482 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ec2e-account-create-sxrxb"]
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.535296 4845 generic.go:334] "Generic (PLEG): container finished" podID="67930554-0731-4033-8f7b-00b365fcde1d" containerID="5557add4845744002f0054b8ffc081bbc796a4ca3622fcb2a9b70d3f720e12dc" exitCode=143
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.535326 4845 generic.go:334] "Generic (PLEG): container finished" podID="67930554-0731-4033-8f7b-00b365fcde1d" containerID="70c7c439c3df5202386f5cfcb9c8a822da9f1324f889cb3fd81243c8c35ad0bd" exitCode=143
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.535366 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"67930554-0731-4033-8f7b-00b365fcde1d","Type":"ContainerDied","Data":"5557add4845744002f0054b8ffc081bbc796a4ca3622fcb2a9b70d3f720e12dc"}
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.535406 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"67930554-0731-4033-8f7b-00b365fcde1d","Type":"ContainerDied","Data":"70c7c439c3df5202386f5cfcb9c8a822da9f1324f889cb3fd81243c8c35ad0bd"}
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.535417 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"67930554-0731-4033-8f7b-00b365fcde1d","Type":"ContainerDied","Data":"5f4f1c05f8fc754916d63e48765c126e83b2361cb02fd39fa811154b8b6d9402"}
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.535433 4845 scope.go:117] "RemoveContainer" containerID="5557add4845744002f0054b8ffc081bbc796a4ca3622fcb2a9b70d3f720e12dc"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.535582 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.539634 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3984290c-98c8-467d-82c3-49fd0007906f","Type":"ContainerStarted","Data":"70bfc33d214cd26293475b91f06f61e2e586246fbab758aefb9455d090fa81d8"}
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.539766 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3984290c-98c8-467d-82c3-49fd0007906f" containerName="glance-log" containerID="cri-o://5cb89ab3617226c212603dd26b74b8260033e0f338de3b14e483969ac84603e8" gracePeriod=30
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.540134 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3984290c-98c8-467d-82c3-49fd0007906f" containerName="glance-httpd" containerID="cri-o://70bfc33d214cd26293475b91f06f61e2e586246fbab758aefb9455d090fa81d8" gracePeriod=30
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.602723 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj5ss\" (UniqueName: \"kubernetes.io/projected/65f7cd79-f2bf-4927-9967-a6f5ccd42bdf-kube-api-access-xj5ss\") pod \"barbican-ec2e-account-create-sxrxb\" (UID: \"65f7cd79-f2bf-4927-9967-a6f5ccd42bdf\") " pod="openstack/barbican-ec2e-account-create-sxrxb"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.607024 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.60700015 podStartE2EDuration="5.60700015s" podCreationTimestamp="2025-10-06 07:01:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:01:06.601681883 +0000 UTC m=+951.116422901" watchObservedRunningTime="2025-10-06 07:01:06.60700015 +0000 UTC m=+951.121741148"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.667346 4845 scope.go:117] "RemoveContainer" containerID="70c7c439c3df5202386f5cfcb9c8a822da9f1324f889cb3fd81243c8c35ad0bd"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.704643 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj5ss\" (UniqueName: \"kubernetes.io/projected/65f7cd79-f2bf-4927-9967-a6f5ccd42bdf-kube-api-access-xj5ss\") pod \"barbican-ec2e-account-create-sxrxb\" (UID: \"65f7cd79-f2bf-4927-9967-a6f5ccd42bdf\") " pod="openstack/barbican-ec2e-account-create-sxrxb"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.706194 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a311-account-create-nqw7t"]
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.707326 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a311-account-create-nqw7t"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.718021 4845 scope.go:117] "RemoveContainer" containerID="5557add4845744002f0054b8ffc081bbc796a4ca3622fcb2a9b70d3f720e12dc"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.718243 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Oct 06 07:01:06 crc kubenswrapper[4845]: E1006 07:01:06.722938 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5557add4845744002f0054b8ffc081bbc796a4ca3622fcb2a9b70d3f720e12dc\": container with ID starting with 5557add4845744002f0054b8ffc081bbc796a4ca3622fcb2a9b70d3f720e12dc not found: ID does not exist" containerID="5557add4845744002f0054b8ffc081bbc796a4ca3622fcb2a9b70d3f720e12dc"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.722978 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5557add4845744002f0054b8ffc081bbc796a4ca3622fcb2a9b70d3f720e12dc"} err="failed to get container status \"5557add4845744002f0054b8ffc081bbc796a4ca3622fcb2a9b70d3f720e12dc\": rpc error: code = NotFound desc = could not find container \"5557add4845744002f0054b8ffc081bbc796a4ca3622fcb2a9b70d3f720e12dc\": container with ID starting with 5557add4845744002f0054b8ffc081bbc796a4ca3622fcb2a9b70d3f720e12dc not found: ID does not exist"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.723000 4845 scope.go:117] "RemoveContainer" containerID="70c7c439c3df5202386f5cfcb9c8a822da9f1324f889cb3fd81243c8c35ad0bd"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.723734 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a311-account-create-nqw7t"]
Oct 06 07:01:06 crc kubenswrapper[4845]: E1006 07:01:06.725196 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70c7c439c3df5202386f5cfcb9c8a822da9f1324f889cb3fd81243c8c35ad0bd\": container with ID starting with 70c7c439c3df5202386f5cfcb9c8a822da9f1324f889cb3fd81243c8c35ad0bd not found: ID does not exist" containerID="70c7c439c3df5202386f5cfcb9c8a822da9f1324f889cb3fd81243c8c35ad0bd"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.725233 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70c7c439c3df5202386f5cfcb9c8a822da9f1324f889cb3fd81243c8c35ad0bd"} err="failed to get container status \"70c7c439c3df5202386f5cfcb9c8a822da9f1324f889cb3fd81243c8c35ad0bd\": rpc error: code = NotFound desc = could not find container \"70c7c439c3df5202386f5cfcb9c8a822da9f1324f889cb3fd81243c8c35ad0bd\": container with ID starting with 70c7c439c3df5202386f5cfcb9c8a822da9f1324f889cb3fd81243c8c35ad0bd not found: ID does not exist"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.725258 4845 scope.go:117] "RemoveContainer" containerID="5557add4845744002f0054b8ffc081bbc796a4ca3622fcb2a9b70d3f720e12dc"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.739296 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj5ss\" (UniqueName: \"kubernetes.io/projected/65f7cd79-f2bf-4927-9967-a6f5ccd42bdf-kube-api-access-xj5ss\") pod \"barbican-ec2e-account-create-sxrxb\" (UID: \"65f7cd79-f2bf-4927-9967-a6f5ccd42bdf\") " pod="openstack/barbican-ec2e-account-create-sxrxb"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.747650 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5557add4845744002f0054b8ffc081bbc796a4ca3622fcb2a9b70d3f720e12dc"} err="failed to get container status \"5557add4845744002f0054b8ffc081bbc796a4ca3622fcb2a9b70d3f720e12dc\": rpc error: code = NotFound desc = could not find container \"5557add4845744002f0054b8ffc081bbc796a4ca3622fcb2a9b70d3f720e12dc\": container with ID starting with 5557add4845744002f0054b8ffc081bbc796a4ca3622fcb2a9b70d3f720e12dc not found: ID does not exist"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.747690 4845 scope.go:117] "RemoveContainer" containerID="70c7c439c3df5202386f5cfcb9c8a822da9f1324f889cb3fd81243c8c35ad0bd"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.748561 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70c7c439c3df5202386f5cfcb9c8a822da9f1324f889cb3fd81243c8c35ad0bd"} err="failed to get container status \"70c7c439c3df5202386f5cfcb9c8a822da9f1324f889cb3fd81243c8c35ad0bd\": rpc error: code = NotFound desc = could not find container \"70c7c439c3df5202386f5cfcb9c8a822da9f1324f889cb3fd81243c8c35ad0bd\": container with ID starting with 70c7c439c3df5202386f5cfcb9c8a822da9f1324f889cb3fd81243c8c35ad0bd not found: ID does not exist"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.805344 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-config-data\") pod \"67930554-0731-4033-8f7b-00b365fcde1d\" (UID: \"67930554-0731-4033-8f7b-00b365fcde1d\") "
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.805899 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbf68\" (UniqueName: \"kubernetes.io/projected/7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27-kube-api-access-rbf68\") pod \"cinder-a311-account-create-nqw7t\" (UID: \"7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27\") " pod="openstack/cinder-a311-account-create-nqw7t"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.811865 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-config-data" (OuterVolumeSpecName: "config-data") pod "67930554-0731-4033-8f7b-00b365fcde1d" (UID: "67930554-0731-4033-8f7b-00b365fcde1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.813188 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ec2e-account-create-sxrxb"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.875318 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.894431 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.910561 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbf68\" (UniqueName: \"kubernetes.io/projected/7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27-kube-api-access-rbf68\") pod \"cinder-a311-account-create-nqw7t\" (UID: \"7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27\") " pod="openstack/cinder-a311-account-create-nqw7t"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.910648 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67930554-0731-4033-8f7b-00b365fcde1d-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.918690 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.920206 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.922923 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.923104 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.924620 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 06 07:01:06 crc kubenswrapper[4845]: I1006 07:01:06.967904 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbf68\" (UniqueName: \"kubernetes.io/projected/7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27-kube-api-access-rbf68\") pod \"cinder-a311-account-create-nqw7t\" (UID: \"7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27\") " pod="openstack/cinder-a311-account-create-nqw7t"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.012028 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.012101 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-config-data\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.012123 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.012165 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-scripts\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.012191 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8smt\" (UniqueName: \"kubernetes.io/projected/f407712f-9d26-4102-a0c5-e7bfc61800ec-kube-api-access-h8smt\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.012237 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.012262 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f407712f-9d26-4102-a0c5-e7bfc61800ec-logs\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.012292 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f407712f-9d26-4102-a0c5-e7bfc61800ec-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.033327 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a311-account-create-nqw7t"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.114130 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8smt\" (UniqueName: \"kubernetes.io/projected/f407712f-9d26-4102-a0c5-e7bfc61800ec-kube-api-access-h8smt\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.114264 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.114303 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f407712f-9d26-4102-a0c5-e7bfc61800ec-logs\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.114358 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f407712f-9d26-4102-a0c5-e7bfc61800ec-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.114404 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.114449 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-config-data\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.114490 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.114567 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-scripts\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.115035 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f407712f-9d26-4102-a0c5-e7bfc61800ec-logs\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.116256 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f407712f-9d26-4102-a0c5-e7bfc61800ec-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.116642 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.124052 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.125943 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-config-data\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.126617 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.127004 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-scripts\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.137295 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8smt\" (UniqueName: \"kubernetes.io/projected/f407712f-9d26-4102-a0c5-e7bfc61800ec-kube-api-access-h8smt\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.158967 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.247157 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.417970 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.440985 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ec2e-account-create-sxrxb"]
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.524515 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-scripts\") pod \"3984290c-98c8-467d-82c3-49fd0007906f\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") "
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.524561 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-internal-tls-certs\") pod \"3984290c-98c8-467d-82c3-49fd0007906f\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") "
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.524809 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-combined-ca-bundle\") pod \"3984290c-98c8-467d-82c3-49fd0007906f\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") "
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.524862 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-config-data\") pod \"3984290c-98c8-467d-82c3-49fd0007906f\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") "
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.524997 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khcfw\" (UniqueName: \"kubernetes.io/projected/3984290c-98c8-467d-82c3-49fd0007906f-kube-api-access-khcfw\") pod \"3984290c-98c8-467d-82c3-49fd0007906f\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") "
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.525088 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"3984290c-98c8-467d-82c3-49fd0007906f\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") "
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.525114 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3984290c-98c8-467d-82c3-49fd0007906f-logs\") pod \"3984290c-98c8-467d-82c3-49fd0007906f\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") "
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.525185 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3984290c-98c8-467d-82c3-49fd0007906f-httpd-run\") pod \"3984290c-98c8-467d-82c3-49fd0007906f\" (UID: \"3984290c-98c8-467d-82c3-49fd0007906f\") "
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.526563 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3984290c-98c8-467d-82c3-49fd0007906f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3984290c-98c8-467d-82c3-49fd0007906f" (UID: "3984290c-98c8-467d-82c3-49fd0007906f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.527118 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3984290c-98c8-467d-82c3-49fd0007906f-logs" (OuterVolumeSpecName: "logs") pod "3984290c-98c8-467d-82c3-49fd0007906f" (UID: "3984290c-98c8-467d-82c3-49fd0007906f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.530003 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-scripts" (OuterVolumeSpecName: "scripts") pod "3984290c-98c8-467d-82c3-49fd0007906f" (UID: "3984290c-98c8-467d-82c3-49fd0007906f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.531328 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3984290c-98c8-467d-82c3-49fd0007906f-kube-api-access-khcfw" (OuterVolumeSpecName: "kube-api-access-khcfw") pod "3984290c-98c8-467d-82c3-49fd0007906f" (UID: "3984290c-98c8-467d-82c3-49fd0007906f"). InnerVolumeSpecName "kube-api-access-khcfw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.534987 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "3984290c-98c8-467d-82c3-49fd0007906f" (UID: "3984290c-98c8-467d-82c3-49fd0007906f"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.562772 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a311-account-create-nqw7t"]
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.565553 4845 generic.go:334] "Generic (PLEG): container finished" podID="3984290c-98c8-467d-82c3-49fd0007906f" containerID="70bfc33d214cd26293475b91f06f61e2e586246fbab758aefb9455d090fa81d8" exitCode=0
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.565584 4845 generic.go:334] "Generic (PLEG): container finished" podID="3984290c-98c8-467d-82c3-49fd0007906f" containerID="5cb89ab3617226c212603dd26b74b8260033e0f338de3b14e483969ac84603e8" exitCode=143
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.565606 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3984290c-98c8-467d-82c3-49fd0007906f","Type":"ContainerDied","Data":"70bfc33d214cd26293475b91f06f61e2e586246fbab758aefb9455d090fa81d8"}
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.565658 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3984290c-98c8-467d-82c3-49fd0007906f","Type":"ContainerDied","Data":"5cb89ab3617226c212603dd26b74b8260033e0f338de3b14e483969ac84603e8"}
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.565685 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3984290c-98c8-467d-82c3-49fd0007906f","Type":"ContainerDied","Data":"c5bdaeff61ba1dbb6f01f52e36810cc0e67f47eb59ffdd1567b48ee1f4ef1ab6"}
Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.565690 4845 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.565705 4845 scope.go:117] "RemoveContainer" containerID="70bfc33d214cd26293475b91f06f61e2e586246fbab758aefb9455d090fa81d8" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.574510 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3984290c-98c8-467d-82c3-49fd0007906f" (UID: "3984290c-98c8-467d-82c3-49fd0007906f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.586834 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3984290c-98c8-467d-82c3-49fd0007906f" (UID: "3984290c-98c8-467d-82c3-49fd0007906f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.587249 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-config-data" (OuterVolumeSpecName: "config-data") pod "3984290c-98c8-467d-82c3-49fd0007906f" (UID: "3984290c-98c8-467d-82c3-49fd0007906f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.628334 4845 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3984290c-98c8-467d-82c3-49fd0007906f-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.628393 4845 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.628409 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.628422 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.628434 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3984290c-98c8-467d-82c3-49fd0007906f-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.628444 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khcfw\" (UniqueName: \"kubernetes.io/projected/3984290c-98c8-467d-82c3-49fd0007906f-kube-api-access-khcfw\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.628486 4845 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.628499 4845 reconciler_common.go:293] 
"Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3984290c-98c8-467d-82c3-49fd0007906f-logs\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.662827 4845 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.729678 4845 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.895527 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.902530 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.910260 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.928642 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 07:01:07 crc kubenswrapper[4845]: E1006 07:01:07.929026 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3984290c-98c8-467d-82c3-49fd0007906f" containerName="glance-log" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.929039 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3984290c-98c8-467d-82c3-49fd0007906f" containerName="glance-log" Oct 06 07:01:07 crc kubenswrapper[4845]: E1006 07:01:07.929057 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3984290c-98c8-467d-82c3-49fd0007906f" containerName="glance-httpd" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.929064 4845 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3984290c-98c8-467d-82c3-49fd0007906f" containerName="glance-httpd" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.929231 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="3984290c-98c8-467d-82c3-49fd0007906f" containerName="glance-log" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.929256 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="3984290c-98c8-467d-82c3-49fd0007906f" containerName="glance-httpd" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.930132 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.931721 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.936699 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 06 07:01:07 crc kubenswrapper[4845]: I1006 07:01:07.943384 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.034944 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74662789-933b-40b7-8e5b-2a3e32ba08f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.034994 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74662789-933b-40b7-8e5b-2a3e32ba08f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc 
kubenswrapper[4845]: I1006 07:01:08.035029 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.035116 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.035250 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7x2h\" (UniqueName: \"kubernetes.io/projected/74662789-933b-40b7-8e5b-2a3e32ba08f4-kube-api-access-t7x2h\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.035314 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.035408 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" 
Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.035476 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.137277 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.137328 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.137392 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7x2h\" (UniqueName: \"kubernetes.io/projected/74662789-933b-40b7-8e5b-2a3e32ba08f4-kube-api-access-t7x2h\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.137417 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc 
kubenswrapper[4845]: I1006 07:01:08.137442 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.137468 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.137547 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74662789-933b-40b7-8e5b-2a3e32ba08f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.137570 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74662789-933b-40b7-8e5b-2a3e32ba08f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.137880 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.138031 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74662789-933b-40b7-8e5b-2a3e32ba08f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.138251 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74662789-933b-40b7-8e5b-2a3e32ba08f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.143670 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.143773 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.144010 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.144021 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.164244 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7x2h\" (UniqueName: \"kubernetes.io/projected/74662789-933b-40b7-8e5b-2a3e32ba08f4-kube-api-access-t7x2h\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.171583 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.241051 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3984290c-98c8-467d-82c3-49fd0007906f" path="/var/lib/kubelet/pods/3984290c-98c8-467d-82c3-49fd0007906f/volumes" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.241930 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67930554-0731-4033-8f7b-00b365fcde1d" path="/var/lib/kubelet/pods/67930554-0731-4033-8f7b-00b365fcde1d/volumes" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.248988 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.579211 4845 generic.go:334] "Generic (PLEG): container finished" podID="8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d" containerID="6ca31b3bb3eca616eed65099486e1d33a2363ee9d6718c2929707499cb202e23" exitCode=0 Oct 06 07:01:08 crc kubenswrapper[4845]: I1006 07:01:08.579253 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w5zsc" event={"ID":"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d","Type":"ContainerDied","Data":"6ca31b3bb3eca616eed65099486e1d33a2363ee9d6718c2929707499cb202e23"} Oct 06 07:01:09 crc kubenswrapper[4845]: W1006 07:01:09.976164 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65f7cd79_f2bf_4927_9967_a6f5ccd42bdf.slice/crio-029cba906c989c4d856a403d30f29a3295fc78f44e2f2d499e3e71ab97cff6ed WatchSource:0}: Error finding container 029cba906c989c4d856a403d30f29a3295fc78f44e2f2d499e3e71ab97cff6ed: Status 404 returned error can't find the container with id 029cba906c989c4d856a403d30f29a3295fc78f44e2f2d499e3e71ab97cff6ed Oct 06 07:01:09 crc kubenswrapper[4845]: W1006 07:01:09.988122 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7594b2d3_25d9_41e7_8b9d_d3c7ad1c3d27.slice/crio-844cb5a52fccbbd603e24bc7bc1da4e66d5298b631b01195bcc52c1060773a5a WatchSource:0}: Error finding container 844cb5a52fccbbd603e24bc7bc1da4e66d5298b631b01195bcc52c1060773a5a: Status 404 returned error can't find the container with id 844cb5a52fccbbd603e24bc7bc1da4e66d5298b631b01195bcc52c1060773a5a Oct 06 07:01:10 crc kubenswrapper[4845]: I1006 07:01:10.597036 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a311-account-create-nqw7t" 
event={"ID":"7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27","Type":"ContainerStarted","Data":"844cb5a52fccbbd603e24bc7bc1da4e66d5298b631b01195bcc52c1060773a5a"} Oct 06 07:01:10 crc kubenswrapper[4845]: I1006 07:01:10.597895 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ec2e-account-create-sxrxb" event={"ID":"65f7cd79-f2bf-4927-9967-a6f5ccd42bdf","Type":"ContainerStarted","Data":"029cba906c989c4d856a403d30f29a3295fc78f44e2f2d499e3e71ab97cff6ed"} Oct 06 07:01:11 crc kubenswrapper[4845]: I1006 07:01:11.949957 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w5zsc" Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.002628 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-config-data\") pod \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.002728 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-fernet-keys\") pod \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.002766 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-credential-keys\") pod \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.002839 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk4xs\" (UniqueName: \"kubernetes.io/projected/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-kube-api-access-pk4xs\") pod 
\"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.002879 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-scripts\") pod \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.002961 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-combined-ca-bundle\") pod \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\" (UID: \"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d\") " Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.008719 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-scripts" (OuterVolumeSpecName: "scripts") pod "8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d" (UID: "8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.008776 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-kube-api-access-pk4xs" (OuterVolumeSpecName: "kube-api-access-pk4xs") pod "8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d" (UID: "8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d"). InnerVolumeSpecName "kube-api-access-pk4xs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.009079 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d" (UID: "8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.019416 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d" (UID: "8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.042588 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-config-data" (OuterVolumeSpecName: "config-data") pod "8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d" (UID: "8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.046525 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d" (UID: "8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.104869 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.104900 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.104910 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.104919 4845 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.104927 4845 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.104936 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk4xs\" (UniqueName: \"kubernetes.io/projected/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d-kube-api-access-pk4xs\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.470068 4845 scope.go:117] "RemoveContainer" containerID="5cb89ab3617226c212603dd26b74b8260033e0f338de3b14e483969ac84603e8" Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.521298 4845 scope.go:117] "RemoveContainer" 
containerID="70bfc33d214cd26293475b91f06f61e2e586246fbab758aefb9455d090fa81d8" Oct 06 07:01:12 crc kubenswrapper[4845]: E1006 07:01:12.521987 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70bfc33d214cd26293475b91f06f61e2e586246fbab758aefb9455d090fa81d8\": container with ID starting with 70bfc33d214cd26293475b91f06f61e2e586246fbab758aefb9455d090fa81d8 not found: ID does not exist" containerID="70bfc33d214cd26293475b91f06f61e2e586246fbab758aefb9455d090fa81d8" Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.522019 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70bfc33d214cd26293475b91f06f61e2e586246fbab758aefb9455d090fa81d8"} err="failed to get container status \"70bfc33d214cd26293475b91f06f61e2e586246fbab758aefb9455d090fa81d8\": rpc error: code = NotFound desc = could not find container \"70bfc33d214cd26293475b91f06f61e2e586246fbab758aefb9455d090fa81d8\": container with ID starting with 70bfc33d214cd26293475b91f06f61e2e586246fbab758aefb9455d090fa81d8 not found: ID does not exist" Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.522039 4845 scope.go:117] "RemoveContainer" containerID="5cb89ab3617226c212603dd26b74b8260033e0f338de3b14e483969ac84603e8" Oct 06 07:01:12 crc kubenswrapper[4845]: E1006 07:01:12.522441 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cb89ab3617226c212603dd26b74b8260033e0f338de3b14e483969ac84603e8\": container with ID starting with 5cb89ab3617226c212603dd26b74b8260033e0f338de3b14e483969ac84603e8 not found: ID does not exist" containerID="5cb89ab3617226c212603dd26b74b8260033e0f338de3b14e483969ac84603e8" Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.522462 4845 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5cb89ab3617226c212603dd26b74b8260033e0f338de3b14e483969ac84603e8"} err="failed to get container status \"5cb89ab3617226c212603dd26b74b8260033e0f338de3b14e483969ac84603e8\": rpc error: code = NotFound desc = could not find container \"5cb89ab3617226c212603dd26b74b8260033e0f338de3b14e483969ac84603e8\": container with ID starting with 5cb89ab3617226c212603dd26b74b8260033e0f338de3b14e483969ac84603e8 not found: ID does not exist"
Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.522474 4845 scope.go:117] "RemoveContainer" containerID="70bfc33d214cd26293475b91f06f61e2e586246fbab758aefb9455d090fa81d8"
Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.523448 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70bfc33d214cd26293475b91f06f61e2e586246fbab758aefb9455d090fa81d8"} err="failed to get container status \"70bfc33d214cd26293475b91f06f61e2e586246fbab758aefb9455d090fa81d8\": rpc error: code = NotFound desc = could not find container \"70bfc33d214cd26293475b91f06f61e2e586246fbab758aefb9455d090fa81d8\": container with ID starting with 70bfc33d214cd26293475b91f06f61e2e586246fbab758aefb9455d090fa81d8 not found: ID does not exist"
Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.523475 4845 scope.go:117] "RemoveContainer" containerID="5cb89ab3617226c212603dd26b74b8260033e0f338de3b14e483969ac84603e8"
Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.524559 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cb89ab3617226c212603dd26b74b8260033e0f338de3b14e483969ac84603e8"} err="failed to get container status \"5cb89ab3617226c212603dd26b74b8260033e0f338de3b14e483969ac84603e8\": rpc error: code = NotFound desc = could not find container \"5cb89ab3617226c212603dd26b74b8260033e0f338de3b14e483969ac84603e8\": container with ID starting with 5cb89ab3617226c212603dd26b74b8260033e0f338de3b14e483969ac84603e8 not found: ID does not exist"
Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.572569 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69b48f67c-jvz42"
Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.654606 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w5zsc" event={"ID":"8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d","Type":"ContainerDied","Data":"5d4b72908be1f108e06991a43da38797bc2bb78ee5327f9fc9c480b687f27f8b"}
Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.654636 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d4b72908be1f108e06991a43da38797bc2bb78ee5327f9fc9c480b687f27f8b"
Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.654691 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w5zsc"
Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.657384 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f407712f-9d26-4102-a0c5-e7bfc61800ec","Type":"ContainerStarted","Data":"5394291eb190a1e54fef3c4d3ef8efe0723abeb74dbc8b36eab5a343959aedfc"}
Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.670795 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-775fbffbc7-qzpjj"]
Oct 06 07:01:12 crc kubenswrapper[4845]: I1006 07:01:12.671017 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" podUID="c4678683-6339-4d33-8297-76bfb9f58bb0" containerName="dnsmasq-dns" containerID="cri-o://9233cacba130e24e77381799a959ffd54d25950e806347f2c586d4541007849d" gracePeriod=10
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.050808 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-w5zsc"]
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.064406 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-w5zsc"]
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.148465 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rd54p"]
Oct 06 07:01:13 crc kubenswrapper[4845]: E1006 07:01:13.148889 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d" containerName="keystone-bootstrap"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.148909 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d" containerName="keystone-bootstrap"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.149821 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d" containerName="keystone-bootstrap"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.150473 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.154986 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.155247 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.155430 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xw6wh"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.162669 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.170683 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rd54p"]
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.197417 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.233573 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-scripts\") pod \"keystone-bootstrap-rd54p\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.233636 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-config-data\") pod \"keystone-bootstrap-rd54p\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.233722 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-fernet-keys\") pod \"keystone-bootstrap-rd54p\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.234003 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-credential-keys\") pod \"keystone-bootstrap-rd54p\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.234104 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdtjm\" (UniqueName: \"kubernetes.io/projected/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-kube-api-access-kdtjm\") pod \"keystone-bootstrap-rd54p\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.234131 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-combined-ca-bundle\") pod \"keystone-bootstrap-rd54p\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.313444 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.336190 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-scripts\") pod \"keystone-bootstrap-rd54p\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.336236 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-config-data\") pod \"keystone-bootstrap-rd54p\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.336309 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-fernet-keys\") pod \"keystone-bootstrap-rd54p\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.336391 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-credential-keys\") pod \"keystone-bootstrap-rd54p\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.336431 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdtjm\" (UniqueName: \"kubernetes.io/projected/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-kube-api-access-kdtjm\") pod \"keystone-bootstrap-rd54p\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.336449 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-combined-ca-bundle\") pod \"keystone-bootstrap-rd54p\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.346687 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-scripts\") pod \"keystone-bootstrap-rd54p\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.348662 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-fernet-keys\") pod \"keystone-bootstrap-rd54p\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.349775 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-credential-keys\") pod \"keystone-bootstrap-rd54p\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.358902 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-config-data\") pod \"keystone-bootstrap-rd54p\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.363406 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-combined-ca-bundle\") pod \"keystone-bootstrap-rd54p\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.404184 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdtjm\" (UniqueName: \"kubernetes.io/projected/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-kube-api-access-kdtjm\") pod \"keystone-bootstrap-rd54p\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.437058 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frfnk\" (UniqueName: \"kubernetes.io/projected/c4678683-6339-4d33-8297-76bfb9f58bb0-kube-api-access-frfnk\") pod \"c4678683-6339-4d33-8297-76bfb9f58bb0\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") "
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.437095 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-ovsdbserver-nb\") pod \"c4678683-6339-4d33-8297-76bfb9f58bb0\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") "
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.437113 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-ovsdbserver-sb\") pod \"c4678683-6339-4d33-8297-76bfb9f58bb0\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") "
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.437135 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-dns-swift-storage-0\") pod \"c4678683-6339-4d33-8297-76bfb9f58bb0\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") "
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.437155 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-dns-svc\") pod \"c4678683-6339-4d33-8297-76bfb9f58bb0\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") "
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.437186 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-config\") pod \"c4678683-6339-4d33-8297-76bfb9f58bb0\" (UID: \"c4678683-6339-4d33-8297-76bfb9f58bb0\") "
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.442595 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4678683-6339-4d33-8297-76bfb9f58bb0-kube-api-access-frfnk" (OuterVolumeSpecName: "kube-api-access-frfnk") pod "c4678683-6339-4d33-8297-76bfb9f58bb0" (UID: "c4678683-6339-4d33-8297-76bfb9f58bb0"). InnerVolumeSpecName "kube-api-access-frfnk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.487394 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c4678683-6339-4d33-8297-76bfb9f58bb0" (UID: "c4678683-6339-4d33-8297-76bfb9f58bb0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.493945 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c4678683-6339-4d33-8297-76bfb9f58bb0" (UID: "c4678683-6339-4d33-8297-76bfb9f58bb0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.527524 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-config" (OuterVolumeSpecName: "config") pod "c4678683-6339-4d33-8297-76bfb9f58bb0" (UID: "c4678683-6339-4d33-8297-76bfb9f58bb0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.533819 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c4678683-6339-4d33-8297-76bfb9f58bb0" (UID: "c4678683-6339-4d33-8297-76bfb9f58bb0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.539848 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frfnk\" (UniqueName: \"kubernetes.io/projected/c4678683-6339-4d33-8297-76bfb9f58bb0-kube-api-access-frfnk\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.539880 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.539889 4845 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.539901 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.539911 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-config\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.539938 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c4678683-6339-4d33-8297-76bfb9f58bb0" (UID: "c4678683-6339-4d33-8297-76bfb9f58bb0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.601725 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rd54p"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.649759 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4678683-6339-4d33-8297-76bfb9f58bb0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.681328 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f407712f-9d26-4102-a0c5-e7bfc61800ec","Type":"ContainerStarted","Data":"42e40e0302e0b1b70505c7633e8c3883db0e48f8173b554e87e6c706ef6d3040"}
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.683250 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rdfkv" event={"ID":"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe","Type":"ContainerStarted","Data":"e94d54819e226679334dff47a83802012e4c5af1b9fc34261ff87281f68a047a"}
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.692580 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74662789-933b-40b7-8e5b-2a3e32ba08f4","Type":"ContainerStarted","Data":"615aca2ba944a15f2bc0f3a05573075fdb3afc2612a384c5ddc34da42368868b"}
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.704417 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-rdfkv" podStartSLOduration=3.245377537 podStartE2EDuration="12.70440237s" podCreationTimestamp="2025-10-06 07:01:01 +0000 UTC" firstStartedPulling="2025-10-06 07:01:03.07757049 +0000 UTC m=+947.592311498" lastFinishedPulling="2025-10-06 07:01:12.536595323 +0000 UTC m=+957.051336331" observedRunningTime="2025-10-06 07:01:13.703584019 +0000 UTC m=+958.218325037" watchObservedRunningTime="2025-10-06 07:01:13.70440237 +0000 UTC m=+958.219143378"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.743338 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01f8105a-29c8-4fef-806a-e1ed266242bf","Type":"ContainerStarted","Data":"5696d56e23da5efa00acfd5b22eb0a7490b3ff5ce5290737fa0d9bed7d2af015"}
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.747694 4845 generic.go:334] "Generic (PLEG): container finished" podID="c4678683-6339-4d33-8297-76bfb9f58bb0" containerID="9233cacba130e24e77381799a959ffd54d25950e806347f2c586d4541007849d" exitCode=0
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.747758 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" event={"ID":"c4678683-6339-4d33-8297-76bfb9f58bb0","Type":"ContainerDied","Data":"9233cacba130e24e77381799a959ffd54d25950e806347f2c586d4541007849d"}
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.747777 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj" event={"ID":"c4678683-6339-4d33-8297-76bfb9f58bb0","Type":"ContainerDied","Data":"e23cf7e2eadd910d7d270f9b6f87ff1232bab8879307ba39d171a527cb8c3065"}
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.747794 4845 scope.go:117] "RemoveContainer" containerID="9233cacba130e24e77381799a959ffd54d25950e806347f2c586d4541007849d"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.747816 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-775fbffbc7-qzpjj"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.753697 4845 generic.go:334] "Generic (PLEG): container finished" podID="65f7cd79-f2bf-4927-9967-a6f5ccd42bdf" containerID="7b5f220d689497528f0edec5f867002f6f0feb8853739cf1a8f7f6508ee98200" exitCode=0
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.753768 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ec2e-account-create-sxrxb" event={"ID":"65f7cd79-f2bf-4927-9967-a6f5ccd42bdf","Type":"ContainerDied","Data":"7b5f220d689497528f0edec5f867002f6f0feb8853739cf1a8f7f6508ee98200"}
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.757681 4845 generic.go:334] "Generic (PLEG): container finished" podID="7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27" containerID="2b0025c15fbc51b486fec70b1550e4d0228b888630bc10223ef1a029dd162cc3" exitCode=0
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.757713 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a311-account-create-nqw7t" event={"ID":"7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27","Type":"ContainerDied","Data":"2b0025c15fbc51b486fec70b1550e4d0228b888630bc10223ef1a029dd162cc3"}
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.785192 4845 scope.go:117] "RemoveContainer" containerID="99599fb217d525d43f6dfb0f189c3ed7e1987529165266beb1e7cafedfaaaf6c"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.804071 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-775fbffbc7-qzpjj"]
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.816694 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-775fbffbc7-qzpjj"]
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.876209 4845 scope.go:117] "RemoveContainer" containerID="9233cacba130e24e77381799a959ffd54d25950e806347f2c586d4541007849d"
Oct 06 07:01:13 crc kubenswrapper[4845]: E1006 07:01:13.878715 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9233cacba130e24e77381799a959ffd54d25950e806347f2c586d4541007849d\": container with ID starting with 9233cacba130e24e77381799a959ffd54d25950e806347f2c586d4541007849d not found: ID does not exist" containerID="9233cacba130e24e77381799a959ffd54d25950e806347f2c586d4541007849d"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.878747 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9233cacba130e24e77381799a959ffd54d25950e806347f2c586d4541007849d"} err="failed to get container status \"9233cacba130e24e77381799a959ffd54d25950e806347f2c586d4541007849d\": rpc error: code = NotFound desc = could not find container \"9233cacba130e24e77381799a959ffd54d25950e806347f2c586d4541007849d\": container with ID starting with 9233cacba130e24e77381799a959ffd54d25950e806347f2c586d4541007849d not found: ID does not exist"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.878768 4845 scope.go:117] "RemoveContainer" containerID="99599fb217d525d43f6dfb0f189c3ed7e1987529165266beb1e7cafedfaaaf6c"
Oct 06 07:01:13 crc kubenswrapper[4845]: E1006 07:01:13.880026 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99599fb217d525d43f6dfb0f189c3ed7e1987529165266beb1e7cafedfaaaf6c\": container with ID starting with 99599fb217d525d43f6dfb0f189c3ed7e1987529165266beb1e7cafedfaaaf6c not found: ID does not exist" containerID="99599fb217d525d43f6dfb0f189c3ed7e1987529165266beb1e7cafedfaaaf6c"
Oct 06 07:01:13 crc kubenswrapper[4845]: I1006 07:01:13.880051 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99599fb217d525d43f6dfb0f189c3ed7e1987529165266beb1e7cafedfaaaf6c"} err="failed to get container status \"99599fb217d525d43f6dfb0f189c3ed7e1987529165266beb1e7cafedfaaaf6c\": rpc error: code = NotFound desc = could not find container \"99599fb217d525d43f6dfb0f189c3ed7e1987529165266beb1e7cafedfaaaf6c\": container with ID starting with 99599fb217d525d43f6dfb0f189c3ed7e1987529165266beb1e7cafedfaaaf6c not found: ID does not exist"
Oct 06 07:01:14 crc kubenswrapper[4845]: I1006 07:01:14.098473 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rd54p"]
Oct 06 07:01:14 crc kubenswrapper[4845]: I1006 07:01:14.240804 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d" path="/var/lib/kubelet/pods/8dbc75c6-6bad-4a56-a1a3-5f1d126fb56d/volumes"
Oct 06 07:01:14 crc kubenswrapper[4845]: I1006 07:01:14.244876 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4678683-6339-4d33-8297-76bfb9f58bb0" path="/var/lib/kubelet/pods/c4678683-6339-4d33-8297-76bfb9f58bb0/volumes"
Oct 06 07:01:14 crc kubenswrapper[4845]: I1006 07:01:14.771824 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rd54p" event={"ID":"b91871bf-75cd-40c0-aa78-a4fd53ff54dc","Type":"ContainerStarted","Data":"8c649eeb37b3414df39222dc7f3fe5c997214b53d0bca6d3794ab775e2eacabb"}
Oct 06 07:01:14 crc kubenswrapper[4845]: I1006 07:01:14.772534 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rd54p" event={"ID":"b91871bf-75cd-40c0-aa78-a4fd53ff54dc","Type":"ContainerStarted","Data":"8707901264e2259a37b4bc8c59f5d7326fc2339965a8abfa4c705f3eb9323969"}
Oct 06 07:01:14 crc kubenswrapper[4845]: I1006 07:01:14.773989 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f407712f-9d26-4102-a0c5-e7bfc61800ec","Type":"ContainerStarted","Data":"1ce44fd2833ce3a9bf718f3299f27f46514d703a42fd910a8f6d20006f404710"}
Oct 06 07:01:14 crc kubenswrapper[4845]: I1006 07:01:14.781504 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74662789-933b-40b7-8e5b-2a3e32ba08f4","Type":"ContainerStarted","Data":"a3548ec736a947129ca3d614632af2fa0d86e4351c4a16563b2891f660f68b28"}
Oct 06 07:01:14 crc kubenswrapper[4845]: I1006 07:01:14.810851 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rd54p" podStartSLOduration=1.810820878 podStartE2EDuration="1.810820878s" podCreationTimestamp="2025-10-06 07:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:01:14.794648332 +0000 UTC m=+959.309389340" watchObservedRunningTime="2025-10-06 07:01:14.810820878 +0000 UTC m=+959.325561886"
Oct 06 07:01:14 crc kubenswrapper[4845]: I1006 07:01:14.828981 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.828944194 podStartE2EDuration="8.828944194s" podCreationTimestamp="2025-10-06 07:01:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:01:14.816708249 +0000 UTC m=+959.331449277" watchObservedRunningTime="2025-10-06 07:01:14.828944194 +0000 UTC m=+959.343685222"
Oct 06 07:01:15 crc kubenswrapper[4845]: I1006 07:01:15.290652 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ec2e-account-create-sxrxb"
Oct 06 07:01:15 crc kubenswrapper[4845]: I1006 07:01:15.297407 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a311-account-create-nqw7t"
Oct 06 07:01:15 crc kubenswrapper[4845]: I1006 07:01:15.381290 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj5ss\" (UniqueName: \"kubernetes.io/projected/65f7cd79-f2bf-4927-9967-a6f5ccd42bdf-kube-api-access-xj5ss\") pod \"65f7cd79-f2bf-4927-9967-a6f5ccd42bdf\" (UID: \"65f7cd79-f2bf-4927-9967-a6f5ccd42bdf\") "
Oct 06 07:01:15 crc kubenswrapper[4845]: I1006 07:01:15.381499 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbf68\" (UniqueName: \"kubernetes.io/projected/7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27-kube-api-access-rbf68\") pod \"7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27\" (UID: \"7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27\") "
Oct 06 07:01:15 crc kubenswrapper[4845]: I1006 07:01:15.387978 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27-kube-api-access-rbf68" (OuterVolumeSpecName: "kube-api-access-rbf68") pod "7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27" (UID: "7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27"). InnerVolumeSpecName "kube-api-access-rbf68". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:01:15 crc kubenswrapper[4845]: I1006 07:01:15.399576 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65f7cd79-f2bf-4927-9967-a6f5ccd42bdf-kube-api-access-xj5ss" (OuterVolumeSpecName: "kube-api-access-xj5ss") pod "65f7cd79-f2bf-4927-9967-a6f5ccd42bdf" (UID: "65f7cd79-f2bf-4927-9967-a6f5ccd42bdf"). InnerVolumeSpecName "kube-api-access-xj5ss". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:01:15 crc kubenswrapper[4845]: I1006 07:01:15.483802 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbf68\" (UniqueName: \"kubernetes.io/projected/7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27-kube-api-access-rbf68\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:15 crc kubenswrapper[4845]: I1006 07:01:15.483867 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj5ss\" (UniqueName: \"kubernetes.io/projected/65f7cd79-f2bf-4927-9967-a6f5ccd42bdf-kube-api-access-xj5ss\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:15 crc kubenswrapper[4845]: I1006 07:01:15.795537 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74662789-933b-40b7-8e5b-2a3e32ba08f4","Type":"ContainerStarted","Data":"e01e6a18db4f10dbbf8fa6de9f9b4a5f67a783a8541148211911eefe4e6dbe7d"}
Oct 06 07:01:15 crc kubenswrapper[4845]: I1006 07:01:15.801855 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01f8105a-29c8-4fef-806a-e1ed266242bf","Type":"ContainerStarted","Data":"56647ac1ce65e6636ec06d1d9263462cc93a281ac5686ad1707654b2dcab7ce4"}
Oct 06 07:01:15 crc kubenswrapper[4845]: I1006 07:01:15.803953 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ec2e-account-create-sxrxb" event={"ID":"65f7cd79-f2bf-4927-9967-a6f5ccd42bdf","Type":"ContainerDied","Data":"029cba906c989c4d856a403d30f29a3295fc78f44e2f2d499e3e71ab97cff6ed"}
Oct 06 07:01:15 crc kubenswrapper[4845]: I1006 07:01:15.804032 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="029cba906c989c4d856a403d30f29a3295fc78f44e2f2d499e3e71ab97cff6ed"
Oct 06 07:01:15 crc kubenswrapper[4845]: I1006 07:01:15.804112 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ec2e-account-create-sxrxb"
Oct 06 07:01:15 crc kubenswrapper[4845]: I1006 07:01:15.814425 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a311-account-create-nqw7t" event={"ID":"7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27","Type":"ContainerDied","Data":"844cb5a52fccbbd603e24bc7bc1da4e66d5298b631b01195bcc52c1060773a5a"}
Oct 06 07:01:15 crc kubenswrapper[4845]: I1006 07:01:15.814478 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="844cb5a52fccbbd603e24bc7bc1da4e66d5298b631b01195bcc52c1060773a5a"
Oct 06 07:01:15 crc kubenswrapper[4845]: I1006 07:01:15.814452 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a311-account-create-nqw7t"
Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.329318 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.329301082 podStartE2EDuration="9.329301082s" podCreationTimestamp="2025-10-06 07:01:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:01:15.823388824 +0000 UTC m=+960.338129832" watchObservedRunningTime="2025-10-06 07:01:16.329301082 +0000 UTC m=+960.844042090"
Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.827123 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-wpzxr"]
Oct 06 07:01:16 crc kubenswrapper[4845]: E1006 07:01:16.827471 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27" containerName="mariadb-account-create"
Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.828427 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27" containerName="mariadb-account-create"
Oct 06 07:01:16 crc kubenswrapper[4845]: E1006 07:01:16.828456 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f7cd79-f2bf-4927-9967-a6f5ccd42bdf" containerName="mariadb-account-create"
Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.828466 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f7cd79-f2bf-4927-9967-a6f5ccd42bdf" containerName="mariadb-account-create"
Oct 06 07:01:16 crc kubenswrapper[4845]: E1006 07:01:16.828481 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4678683-6339-4d33-8297-76bfb9f58bb0" containerName="dnsmasq-dns"
Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.828488 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4678683-6339-4d33-8297-76bfb9f58bb0" containerName="dnsmasq-dns"
Oct 06 07:01:16 crc kubenswrapper[4845]: E1006 07:01:16.828522 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4678683-6339-4d33-8297-76bfb9f58bb0" containerName="init"
Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.828533 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4678683-6339-4d33-8297-76bfb9f58bb0" containerName="init"
Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.828756 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27" containerName="mariadb-account-create"
Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.828777 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="65f7cd79-f2bf-4927-9967-a6f5ccd42bdf" containerName="mariadb-account-create"
Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.828797 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4678683-6339-4d33-8297-76bfb9f58bb0" containerName="dnsmasq-dns"
Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.834054 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wpzxr"
Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.836505 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nqrqx"
Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.840240 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.840513 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rdfkv" event={"ID":"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe","Type":"ContainerDied","Data":"e94d54819e226679334dff47a83802012e4c5af1b9fc34261ff87281f68a047a"}
Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.842920 4845 generic.go:334] "Generic (PLEG): container finished" podID="9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe" containerID="e94d54819e226679334dff47a83802012e4c5af1b9fc34261ff87281f68a047a" exitCode=0
Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.867180 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wpzxr"]
Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.920854 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eecc14ae-0a55-4b21-9ecf-07b6c122e050-combined-ca-bundle\") pod \"barbican-db-sync-wpzxr\" (UID: \"eecc14ae-0a55-4b21-9ecf-07b6c122e050\") " pod="openstack/barbican-db-sync-wpzxr"
Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.920998 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eecc14ae-0a55-4b21-9ecf-07b6c122e050-db-sync-config-data\") pod \"barbican-db-sync-wpzxr\" (UID: \"eecc14ae-0a55-4b21-9ecf-07b6c122e050\") " pod="openstack/barbican-db-sync-wpzxr"
Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.921030 4845
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7f5q\" (UniqueName: \"kubernetes.io/projected/eecc14ae-0a55-4b21-9ecf-07b6c122e050-kube-api-access-n7f5q\") pod \"barbican-db-sync-wpzxr\" (UID: \"eecc14ae-0a55-4b21-9ecf-07b6c122e050\") " pod="openstack/barbican-db-sync-wpzxr" Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.922075 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-4wcpl"] Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.923385 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.926162 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.926839 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.927034 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mznpm" Oct 06 07:01:16 crc kubenswrapper[4845]: I1006 07:01:16.938752 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-4wcpl"] Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.023319 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7f5q\" (UniqueName: \"kubernetes.io/projected/eecc14ae-0a55-4b21-9ecf-07b6c122e050-kube-api-access-n7f5q\") pod \"barbican-db-sync-wpzxr\" (UID: \"eecc14ae-0a55-4b21-9ecf-07b6c122e050\") " pod="openstack/barbican-db-sync-wpzxr" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.023423 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-config-data\") pod 
\"cinder-db-sync-4wcpl\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.023640 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eecc14ae-0a55-4b21-9ecf-07b6c122e050-combined-ca-bundle\") pod \"barbican-db-sync-wpzxr\" (UID: \"eecc14ae-0a55-4b21-9ecf-07b6c122e050\") " pod="openstack/barbican-db-sync-wpzxr" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.023996 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-scripts\") pod \"cinder-db-sync-4wcpl\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.024044 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-combined-ca-bundle\") pod \"cinder-db-sync-4wcpl\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.024136 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-db-sync-config-data\") pod \"cinder-db-sync-4wcpl\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.024214 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcffr\" (UniqueName: \"kubernetes.io/projected/8c8a9f74-3441-4e68-8014-08472abf6680-kube-api-access-bcffr\") pod \"cinder-db-sync-4wcpl\" (UID: 
\"8c8a9f74-3441-4e68-8014-08472abf6680\") " pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.024419 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c8a9f74-3441-4e68-8014-08472abf6680-etc-machine-id\") pod \"cinder-db-sync-4wcpl\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.024448 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eecc14ae-0a55-4b21-9ecf-07b6c122e050-db-sync-config-data\") pod \"barbican-db-sync-wpzxr\" (UID: \"eecc14ae-0a55-4b21-9ecf-07b6c122e050\") " pod="openstack/barbican-db-sync-wpzxr" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.044037 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eecc14ae-0a55-4b21-9ecf-07b6c122e050-combined-ca-bundle\") pod \"barbican-db-sync-wpzxr\" (UID: \"eecc14ae-0a55-4b21-9ecf-07b6c122e050\") " pod="openstack/barbican-db-sync-wpzxr" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.044857 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eecc14ae-0a55-4b21-9ecf-07b6c122e050-db-sync-config-data\") pod \"barbican-db-sync-wpzxr\" (UID: \"eecc14ae-0a55-4b21-9ecf-07b6c122e050\") " pod="openstack/barbican-db-sync-wpzxr" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.055658 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7f5q\" (UniqueName: \"kubernetes.io/projected/eecc14ae-0a55-4b21-9ecf-07b6c122e050-kube-api-access-n7f5q\") pod \"barbican-db-sync-wpzxr\" (UID: \"eecc14ae-0a55-4b21-9ecf-07b6c122e050\") " pod="openstack/barbican-db-sync-wpzxr" Oct 06 
07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.126780 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c8a9f74-3441-4e68-8014-08472abf6680-etc-machine-id\") pod \"cinder-db-sync-4wcpl\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.126861 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-config-data\") pod \"cinder-db-sync-4wcpl\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.126878 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c8a9f74-3441-4e68-8014-08472abf6680-etc-machine-id\") pod \"cinder-db-sync-4wcpl\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.126908 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-scripts\") pod \"cinder-db-sync-4wcpl\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.126982 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-combined-ca-bundle\") pod \"cinder-db-sync-4wcpl\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.127029 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-db-sync-config-data\") pod \"cinder-db-sync-4wcpl\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.127144 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcffr\" (UniqueName: \"kubernetes.io/projected/8c8a9f74-3441-4e68-8014-08472abf6680-kube-api-access-bcffr\") pod \"cinder-db-sync-4wcpl\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.131520 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-combined-ca-bundle\") pod \"cinder-db-sync-4wcpl\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.131958 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-db-sync-config-data\") pod \"cinder-db-sync-4wcpl\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.132633 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-scripts\") pod \"cinder-db-sync-4wcpl\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.144865 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-config-data\") pod \"cinder-db-sync-4wcpl\" (UID: 
\"8c8a9f74-3441-4e68-8014-08472abf6680\") " pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.147253 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcffr\" (UniqueName: \"kubernetes.io/projected/8c8a9f74-3441-4e68-8014-08472abf6680-kube-api-access-bcffr\") pod \"cinder-db-sync-4wcpl\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.160057 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wpzxr" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.247857 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.247919 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.249363 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.292328 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.303149 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.862537 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 07:01:17 crc kubenswrapper[4845]: I1006 07:01:17.862653 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.249864 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.250324 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.292986 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.293062 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.491969 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-rdfkv" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.568636 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-scripts\") pod \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.568695 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-config-data\") pod \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.568738 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4jjd\" (UniqueName: \"kubernetes.io/projected/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-kube-api-access-b4jjd\") pod \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.568771 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-combined-ca-bundle\") pod \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.568824 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-logs\") pod \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\" (UID: \"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe\") " Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.569356 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-logs" (OuterVolumeSpecName: "logs") pod "9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe" (UID: "9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.574016 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-scripts" (OuterVolumeSpecName: "scripts") pod "9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe" (UID: "9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.580927 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-kube-api-access-b4jjd" (OuterVolumeSpecName: "kube-api-access-b4jjd") pod "9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe" (UID: "9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe"). InnerVolumeSpecName "kube-api-access-b4jjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.593567 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-config-data" (OuterVolumeSpecName: "config-data") pod "9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe" (UID: "9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.593604 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe" (UID: "9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.671510 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.671556 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.671574 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4jjd\" (UniqueName: \"kubernetes.io/projected/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-kube-api-access-b4jjd\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.671587 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.671599 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe-logs\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.870860 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rdfkv" event={"ID":"9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe","Type":"ContainerDied","Data":"db4fae6f2bce933fbc73ccbd9b427057439d44f19b45cd52b77b4259f07820ba"} Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.871194 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db4fae6f2bce933fbc73ccbd9b427057439d44f19b45cd52b77b4259f07820ba" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.870900 4845 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-db-sync-rdfkv" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.872140 4845 generic.go:334] "Generic (PLEG): container finished" podID="b91871bf-75cd-40c0-aa78-a4fd53ff54dc" containerID="8c649eeb37b3414df39222dc7f3fe5c997214b53d0bca6d3794ab775e2eacabb" exitCode=0 Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.872197 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rd54p" event={"ID":"b91871bf-75cd-40c0-aa78-a4fd53ff54dc","Type":"ContainerDied","Data":"8c649eeb37b3414df39222dc7f3fe5c997214b53d0bca6d3794ab775e2eacabb"} Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.872449 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.872581 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.962808 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8469844cbb-s9qws"] Oct 06 07:01:18 crc kubenswrapper[4845]: E1006 07:01:18.963160 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe" containerName="placement-db-sync" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.963175 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe" containerName="placement-db-sync" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.963361 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe" containerName="placement-db-sync" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.966493 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.968179 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.969307 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.969610 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.969718 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.969916 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qj9gj" Oct 06 07:01:18 crc kubenswrapper[4845]: I1006 07:01:18.984796 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8469844cbb-s9qws"] Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.077122 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0921c6da-fb36-4acf-b978-252f370ccc30-internal-tls-certs\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.077168 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0921c6da-fb36-4acf-b978-252f370ccc30-scripts\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.077194 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0921c6da-fb36-4acf-b978-252f370ccc30-public-tls-certs\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.077231 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0921c6da-fb36-4acf-b978-252f370ccc30-logs\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.077254 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0921c6da-fb36-4acf-b978-252f370ccc30-combined-ca-bundle\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.077281 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6htjn\" (UniqueName: \"kubernetes.io/projected/0921c6da-fb36-4acf-b978-252f370ccc30-kube-api-access-6htjn\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.077308 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0921c6da-fb36-4acf-b978-252f370ccc30-config-data\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: E1006 07:01:19.117185 4845 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f5c42ed_45a5_40fe_8e8e_aaa105c2ffbe.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f5c42ed_45a5_40fe_8e8e_aaa105c2ffbe.slice/crio-db4fae6f2bce933fbc73ccbd9b427057439d44f19b45cd52b77b4259f07820ba\": RecentStats: unable to find data in memory cache]" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.177695 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0921c6da-fb36-4acf-b978-252f370ccc30-scripts\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.177754 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0921c6da-fb36-4acf-b978-252f370ccc30-public-tls-certs\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.177792 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0921c6da-fb36-4acf-b978-252f370ccc30-logs\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.177813 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0921c6da-fb36-4acf-b978-252f370ccc30-combined-ca-bundle\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " 
pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.177844 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6htjn\" (UniqueName: \"kubernetes.io/projected/0921c6da-fb36-4acf-b978-252f370ccc30-kube-api-access-6htjn\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.177872 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0921c6da-fb36-4acf-b978-252f370ccc30-config-data\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.177935 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0921c6da-fb36-4acf-b978-252f370ccc30-internal-tls-certs\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.178626 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0921c6da-fb36-4acf-b978-252f370ccc30-logs\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.182463 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0921c6da-fb36-4acf-b978-252f370ccc30-public-tls-certs\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 
07:01:19.182638 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0921c6da-fb36-4acf-b978-252f370ccc30-internal-tls-certs\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.184760 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0921c6da-fb36-4acf-b978-252f370ccc30-combined-ca-bundle\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.185214 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0921c6da-fb36-4acf-b978-252f370ccc30-scripts\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.185697 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0921c6da-fb36-4acf-b978-252f370ccc30-config-data\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.195167 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6htjn\" (UniqueName: \"kubernetes.io/projected/0921c6da-fb36-4acf-b978-252f370ccc30-kube-api-access-6htjn\") pod \"placement-8469844cbb-s9qws\" (UID: \"0921c6da-fb36-4acf-b978-252f370ccc30\") " pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:19 crc kubenswrapper[4845]: I1006 07:01:19.283245 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.007143 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.037819 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.412347 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rd54p" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.541181 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdtjm\" (UniqueName: \"kubernetes.io/projected/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-kube-api-access-kdtjm\") pod \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.541258 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-fernet-keys\") pod \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.541451 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-config-data\") pod \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.541521 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-credential-keys\") pod \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\" (UID: 
\"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.541623 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-combined-ca-bundle\") pod \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.541662 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-scripts\") pod \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\" (UID: \"b91871bf-75cd-40c0-aa78-a4fd53ff54dc\") " Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.567354 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b91871bf-75cd-40c0-aa78-a4fd53ff54dc" (UID: "b91871bf-75cd-40c0-aa78-a4fd53ff54dc"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.571498 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b91871bf-75cd-40c0-aa78-a4fd53ff54dc" (UID: "b91871bf-75cd-40c0-aa78-a4fd53ff54dc"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.567389 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-kube-api-access-kdtjm" (OuterVolumeSpecName: "kube-api-access-kdtjm") pod "b91871bf-75cd-40c0-aa78-a4fd53ff54dc" (UID: "b91871bf-75cd-40c0-aa78-a4fd53ff54dc"). 
InnerVolumeSpecName "kube-api-access-kdtjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.582337 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b91871bf-75cd-40c0-aa78-a4fd53ff54dc" (UID: "b91871bf-75cd-40c0-aa78-a4fd53ff54dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.583798 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-scripts" (OuterVolumeSpecName: "scripts") pod "b91871bf-75cd-40c0-aa78-a4fd53ff54dc" (UID: "b91871bf-75cd-40c0-aa78-a4fd53ff54dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.600880 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-config-data" (OuterVolumeSpecName: "config-data") pod "b91871bf-75cd-40c0-aa78-a4fd53ff54dc" (UID: "b91871bf-75cd-40c0-aa78-a4fd53ff54dc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.644228 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.644261 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdtjm\" (UniqueName: \"kubernetes.io/projected/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-kube-api-access-kdtjm\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.644272 4845 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.644281 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.644289 4845 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.644297 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91871bf-75cd-40c0-aa78-a4fd53ff54dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.865616 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8469844cbb-s9qws"] Oct 06 07:01:21 crc kubenswrapper[4845]: W1006 07:01:21.867213 4845 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0921c6da_fb36_4acf_b978_252f370ccc30.slice/crio-4bf6ff86ea79349ae287b06531adad8b87de5c5f3bd04bffa8eeee4f2564c2f3 WatchSource:0}: Error finding container 4bf6ff86ea79349ae287b06531adad8b87de5c5f3bd04bffa8eeee4f2564c2f3: Status 404 returned error can't find the container with id 4bf6ff86ea79349ae287b06531adad8b87de5c5f3bd04bffa8eeee4f2564c2f3 Oct 06 07:01:21 crc kubenswrapper[4845]: W1006 07:01:21.874291 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c8a9f74_3441_4e68_8014_08472abf6680.slice/crio-94334d48dee581d4b37fe73ef46048545a540b25769c18388bcda2deb979e81e WatchSource:0}: Error finding container 94334d48dee581d4b37fe73ef46048545a540b25769c18388bcda2deb979e81e: Status 404 returned error can't find the container with id 94334d48dee581d4b37fe73ef46048545a540b25769c18388bcda2deb979e81e Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.875161 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-4wcpl"] Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.898668 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rd54p" event={"ID":"b91871bf-75cd-40c0-aa78-a4fd53ff54dc","Type":"ContainerDied","Data":"8707901264e2259a37b4bc8c59f5d7326fc2339965a8abfa4c705f3eb9323969"} Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.898705 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rd54p" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.898709 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8707901264e2259a37b4bc8c59f5d7326fc2339965a8abfa4c705f3eb9323969" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.900262 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4wcpl" event={"ID":"8c8a9f74-3441-4e68-8014-08472abf6680","Type":"ContainerStarted","Data":"94334d48dee581d4b37fe73ef46048545a540b25769c18388bcda2deb979e81e"} Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.901509 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8469844cbb-s9qws" event={"ID":"0921c6da-fb36-4acf-b978-252f370ccc30","Type":"ContainerStarted","Data":"4bf6ff86ea79349ae287b06531adad8b87de5c5f3bd04bffa8eeee4f2564c2f3"} Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.931426 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.939212 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 07:01:21 crc kubenswrapper[4845]: I1006 07:01:21.963269 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wpzxr"] Oct 06 07:01:22 crc kubenswrapper[4845]: W1006 07:01:22.088951 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeecc14ae_0a55_4b21_9ecf_07b6c122e050.slice/crio-92d3ee3556335db20a55d8dcc0c0705b11615f52817e1770210545a64f53af98 WatchSource:0}: Error finding container 92d3ee3556335db20a55d8dcc0c0705b11615f52817e1770210545a64f53af98: Status 404 returned error can't find the container with id 92d3ee3556335db20a55d8dcc0c0705b11615f52817e1770210545a64f53af98 Oct 06 07:01:22 crc 
kubenswrapper[4845]: I1006 07:01:22.590883 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-67cb66d46f-6rxvh"] Oct 06 07:01:22 crc kubenswrapper[4845]: E1006 07:01:22.591367 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b91871bf-75cd-40c0-aa78-a4fd53ff54dc" containerName="keystone-bootstrap" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.591399 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91871bf-75cd-40c0-aa78-a4fd53ff54dc" containerName="keystone-bootstrap" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.591668 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="b91871bf-75cd-40c0-aa78-a4fd53ff54dc" containerName="keystone-bootstrap" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.592388 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.595453 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xw6wh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.595531 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.595773 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.596312 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.597293 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.598189 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.600228 4845 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-67cb66d46f-6rxvh"] Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.678871 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q97pj\" (UniqueName: \"kubernetes.io/projected/6427f38b-494b-4cd7-b019-aa8db716ffe0-kube-api-access-q97pj\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.679507 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-config-data\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.680338 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-fernet-keys\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.680401 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-public-tls-certs\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.680440 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-scripts\") pod \"keystone-67cb66d46f-6rxvh\" (UID: 
\"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.680457 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-internal-tls-certs\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.680510 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-credential-keys\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.680533 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-combined-ca-bundle\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.784226 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q97pj\" (UniqueName: \"kubernetes.io/projected/6427f38b-494b-4cd7-b019-aa8db716ffe0-kube-api-access-q97pj\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.784287 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-config-data\") pod \"keystone-67cb66d46f-6rxvh\" (UID: 
\"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.784334 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-fernet-keys\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.784421 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-public-tls-certs\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.784447 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-scripts\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.784464 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-internal-tls-certs\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.784506 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-credential-keys\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc 
kubenswrapper[4845]: I1006 07:01:22.784526 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-combined-ca-bundle\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.796018 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-credential-keys\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.796111 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-public-tls-certs\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.796845 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-internal-tls-certs\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.798516 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-fernet-keys\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.802852 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-combined-ca-bundle\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.807513 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-scripts\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.807692 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q97pj\" (UniqueName: \"kubernetes.io/projected/6427f38b-494b-4cd7-b019-aa8db716ffe0-kube-api-access-q97pj\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.815832 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6427f38b-494b-4cd7-b019-aa8db716ffe0-config-data\") pod \"keystone-67cb66d46f-6rxvh\" (UID: \"6427f38b-494b-4cd7-b019-aa8db716ffe0\") " pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.913606 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wpzxr" event={"ID":"eecc14ae-0a55-4b21-9ecf-07b6c122e050","Type":"ContainerStarted","Data":"92d3ee3556335db20a55d8dcc0c0705b11615f52817e1770210545a64f53af98"} Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.916954 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01f8105a-29c8-4fef-806a-e1ed266242bf","Type":"ContainerStarted","Data":"8a0e94c70de0df0ba83c97f7dbd701bb277d2194c426c5f900dbd4bcef3dfae1"} Oct 06 07:01:22 crc 
kubenswrapper[4845]: I1006 07:01:22.919955 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8469844cbb-s9qws" event={"ID":"0921c6da-fb36-4acf-b978-252f370ccc30","Type":"ContainerStarted","Data":"4f15be02c2ead7db937814eee7df8949f8449a75fdbb39fd9c70fce2e9d61671"} Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.920097 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.920185 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8469844cbb-s9qws" event={"ID":"0921c6da-fb36-4acf-b978-252f370ccc30","Type":"ContainerStarted","Data":"80de6d38f4aebebe516821c76c7f9b8b1a4559c3ef9b21ffa3eda0076846a143"} Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.920294 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.946465 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:22 crc kubenswrapper[4845]: I1006 07:01:22.953131 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-8469844cbb-s9qws" podStartSLOduration=4.953101845 podStartE2EDuration="4.953101845s" podCreationTimestamp="2025-10-06 07:01:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:01:22.943386225 +0000 UTC m=+967.458127243" watchObservedRunningTime="2025-10-06 07:01:22.953101845 +0000 UTC m=+967.467842853" Oct 06 07:01:23 crc kubenswrapper[4845]: I1006 07:01:23.434128 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-67cb66d46f-6rxvh"] Oct 06 07:01:23 crc kubenswrapper[4845]: I1006 07:01:23.930530 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-67cb66d46f-6rxvh" event={"ID":"6427f38b-494b-4cd7-b019-aa8db716ffe0","Type":"ContainerStarted","Data":"31a3853376948bb9e6c99c7cf7422a6d6ffa870e5b6e9fef8c4608fda51e0af8"} Oct 06 07:01:23 crc kubenswrapper[4845]: I1006 07:01:23.930605 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-67cb66d46f-6rxvh" event={"ID":"6427f38b-494b-4cd7-b019-aa8db716ffe0","Type":"ContainerStarted","Data":"7392a145e8a94e69202fe7ddcbd7bd169bd8ba99a5975e399d5d6904da156132"} Oct 06 07:01:23 crc kubenswrapper[4845]: I1006 07:01:23.957043 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-67cb66d46f-6rxvh" podStartSLOduration=1.956987738 podStartE2EDuration="1.956987738s" podCreationTimestamp="2025-10-06 07:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:01:23.950159682 +0000 UTC m=+968.464900700" watchObservedRunningTime="2025-10-06 07:01:23.956987738 +0000 UTC m=+968.471728746" Oct 06 07:01:24 
crc kubenswrapper[4845]: I1006 07:01:24.938901 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-67cb66d46f-6rxvh" Oct 06 07:01:29 crc kubenswrapper[4845]: I1006 07:01:29.985917 4845 generic.go:334] "Generic (PLEG): container finished" podID="00ff2f10-a7a1-458b-8a67-78879221e169" containerID="58caf3b6149fbdb8593cc52903e01819229bfcd094c10597ff05d55cb3e14416" exitCode=0 Oct 06 07:01:29 crc kubenswrapper[4845]: I1006 07:01:29.986017 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rpmdt" event={"ID":"00ff2f10-a7a1-458b-8a67-78879221e169","Type":"ContainerDied","Data":"58caf3b6149fbdb8593cc52903e01819229bfcd094c10597ff05d55cb3e14416"} Oct 06 07:01:32 crc kubenswrapper[4845]: I1006 07:01:32.003257 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rpmdt" event={"ID":"00ff2f10-a7a1-458b-8a67-78879221e169","Type":"ContainerDied","Data":"39ceb27c5bf83a80ffbd50f6782e4943ab7718f0ca44f87c6bc97265531dace2"} Oct 06 07:01:32 crc kubenswrapper[4845]: I1006 07:01:32.003833 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39ceb27c5bf83a80ffbd50f6782e4943ab7718f0ca44f87c6bc97265531dace2" Oct 06 07:01:32 crc kubenswrapper[4845]: I1006 07:01:32.019547 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-rpmdt" Oct 06 07:01:32 crc kubenswrapper[4845]: I1006 07:01:32.151083 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/00ff2f10-a7a1-458b-8a67-78879221e169-config\") pod \"00ff2f10-a7a1-458b-8a67-78879221e169\" (UID: \"00ff2f10-a7a1-458b-8a67-78879221e169\") " Oct 06 07:01:32 crc kubenswrapper[4845]: I1006 07:01:32.151232 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ff2f10-a7a1-458b-8a67-78879221e169-combined-ca-bundle\") pod \"00ff2f10-a7a1-458b-8a67-78879221e169\" (UID: \"00ff2f10-a7a1-458b-8a67-78879221e169\") " Oct 06 07:01:32 crc kubenswrapper[4845]: I1006 07:01:32.151267 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdv25\" (UniqueName: \"kubernetes.io/projected/00ff2f10-a7a1-458b-8a67-78879221e169-kube-api-access-zdv25\") pod \"00ff2f10-a7a1-458b-8a67-78879221e169\" (UID: \"00ff2f10-a7a1-458b-8a67-78879221e169\") " Oct 06 07:01:32 crc kubenswrapper[4845]: I1006 07:01:32.155671 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ff2f10-a7a1-458b-8a67-78879221e169-kube-api-access-zdv25" (OuterVolumeSpecName: "kube-api-access-zdv25") pod "00ff2f10-a7a1-458b-8a67-78879221e169" (UID: "00ff2f10-a7a1-458b-8a67-78879221e169"). InnerVolumeSpecName "kube-api-access-zdv25". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:01:32 crc kubenswrapper[4845]: I1006 07:01:32.176684 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ff2f10-a7a1-458b-8a67-78879221e169-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00ff2f10-a7a1-458b-8a67-78879221e169" (UID: "00ff2f10-a7a1-458b-8a67-78879221e169"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:32 crc kubenswrapper[4845]: I1006 07:01:32.182011 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ff2f10-a7a1-458b-8a67-78879221e169-config" (OuterVolumeSpecName: "config") pod "00ff2f10-a7a1-458b-8a67-78879221e169" (UID: "00ff2f10-a7a1-458b-8a67-78879221e169"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:32 crc kubenswrapper[4845]: I1006 07:01:32.253116 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ff2f10-a7a1-458b-8a67-78879221e169-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:32 crc kubenswrapper[4845]: I1006 07:01:32.253161 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdv25\" (UniqueName: \"kubernetes.io/projected/00ff2f10-a7a1-458b-8a67-78879221e169-kube-api-access-zdv25\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:32 crc kubenswrapper[4845]: I1006 07:01:32.253175 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/00ff2f10-a7a1-458b-8a67-78879221e169-config\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.011953 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-rpmdt" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.011989 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wpzxr" event={"ID":"eecc14ae-0a55-4b21-9ecf-07b6c122e050","Type":"ContainerStarted","Data":"f933d86f47f22e4a63f9e72dcd42428743559269c4e8cbd4ca7be74171922570"} Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.029788 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-wpzxr" podStartSLOduration=7.251033124 podStartE2EDuration="17.029767529s" podCreationTimestamp="2025-10-06 07:01:16 +0000 UTC" firstStartedPulling="2025-10-06 07:01:22.100089902 +0000 UTC m=+966.614830910" lastFinishedPulling="2025-10-06 07:01:31.878824307 +0000 UTC m=+976.393565315" observedRunningTime="2025-10-06 07:01:33.026473365 +0000 UTC m=+977.541214363" watchObservedRunningTime="2025-10-06 07:01:33.029767529 +0000 UTC m=+977.544508547" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.179622 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79f8b69b5-x65p4"] Oct 06 07:01:33 crc kubenswrapper[4845]: E1006 07:01:33.179975 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ff2f10-a7a1-458b-8a67-78879221e169" containerName="neutron-db-sync" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.179991 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ff2f10-a7a1-458b-8a67-78879221e169" containerName="neutron-db-sync" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.180153 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ff2f10-a7a1-458b-8a67-78879221e169" containerName="neutron-db-sync" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.181365 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.198577 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79f8b69b5-x65p4"] Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.269156 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgmcr\" (UniqueName: \"kubernetes.io/projected/2e1193ce-a230-46ae-bef2-747405888b97-kube-api-access-xgmcr\") pod \"dnsmasq-dns-79f8b69b5-x65p4\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.269218 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-ovsdbserver-nb\") pod \"dnsmasq-dns-79f8b69b5-x65p4\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.269304 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-config\") pod \"dnsmasq-dns-79f8b69b5-x65p4\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.269351 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-dns-svc\") pod \"dnsmasq-dns-79f8b69b5-x65p4\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.269505 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-dns-swift-storage-0\") pod \"dnsmasq-dns-79f8b69b5-x65p4\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.269536 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-ovsdbserver-sb\") pod \"dnsmasq-dns-79f8b69b5-x65p4\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.331950 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5574b86b4d-dzxdt"] Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.335515 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.349481 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.349546 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.352216 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.352316 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-65vsq" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.374263 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-ovsdbserver-nb\") pod \"dnsmasq-dns-79f8b69b5-x65p4\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") 
" pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.377226 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5574b86b4d-dzxdt"] Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.378409 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-ovsdbserver-nb\") pod \"dnsmasq-dns-79f8b69b5-x65p4\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.378664 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-config\") pod \"dnsmasq-dns-79f8b69b5-x65p4\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.378744 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-dns-svc\") pod \"dnsmasq-dns-79f8b69b5-x65p4\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.378873 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-dns-swift-storage-0\") pod \"dnsmasq-dns-79f8b69b5-x65p4\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.378946 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-ovsdbserver-sb\") pod 
\"dnsmasq-dns-79f8b69b5-x65p4\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.379193 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgmcr\" (UniqueName: \"kubernetes.io/projected/2e1193ce-a230-46ae-bef2-747405888b97-kube-api-access-xgmcr\") pod \"dnsmasq-dns-79f8b69b5-x65p4\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.381560 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-config\") pod \"dnsmasq-dns-79f8b69b5-x65p4\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.382361 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-ovsdbserver-sb\") pod \"dnsmasq-dns-79f8b69b5-x65p4\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.383203 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-dns-swift-storage-0\") pod \"dnsmasq-dns-79f8b69b5-x65p4\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.383539 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-dns-svc\") pod \"dnsmasq-dns-79f8b69b5-x65p4\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " 
pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.413724 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgmcr\" (UniqueName: \"kubernetes.io/projected/2e1193ce-a230-46ae-bef2-747405888b97-kube-api-access-xgmcr\") pod \"dnsmasq-dns-79f8b69b5-x65p4\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.480221 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcxvl\" (UniqueName: \"kubernetes.io/projected/7e6e1657-fd8e-4dd4-b3be-166870ab020a-kube-api-access-hcxvl\") pod \"neutron-5574b86b4d-dzxdt\" (UID: \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.480296 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-httpd-config\") pod \"neutron-5574b86b4d-dzxdt\" (UID: \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.480504 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-config\") pod \"neutron-5574b86b4d-dzxdt\" (UID: \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.480546 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-ovndb-tls-certs\") pod \"neutron-5574b86b4d-dzxdt\" (UID: \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " 
pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.480562 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-combined-ca-bundle\") pod \"neutron-5574b86b4d-dzxdt\" (UID: \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.500399 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.582271 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-config\") pod \"neutron-5574b86b4d-dzxdt\" (UID: \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.582657 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-ovndb-tls-certs\") pod \"neutron-5574b86b4d-dzxdt\" (UID: \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.582688 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-combined-ca-bundle\") pod \"neutron-5574b86b4d-dzxdt\" (UID: \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.582729 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcxvl\" (UniqueName: 
\"kubernetes.io/projected/7e6e1657-fd8e-4dd4-b3be-166870ab020a-kube-api-access-hcxvl\") pod \"neutron-5574b86b4d-dzxdt\" (UID: \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.582768 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-httpd-config\") pod \"neutron-5574b86b4d-dzxdt\" (UID: \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.586870 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-config\") pod \"neutron-5574b86b4d-dzxdt\" (UID: \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.587332 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-httpd-config\") pod \"neutron-5574b86b4d-dzxdt\" (UID: \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.590351 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-combined-ca-bundle\") pod \"neutron-5574b86b4d-dzxdt\" (UID: \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.591170 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-ovndb-tls-certs\") pod \"neutron-5574b86b4d-dzxdt\" (UID: 
\"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.599940 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcxvl\" (UniqueName: \"kubernetes.io/projected/7e6e1657-fd8e-4dd4-b3be-166870ab020a-kube-api-access-hcxvl\") pod \"neutron-5574b86b4d-dzxdt\" (UID: \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:01:33 crc kubenswrapper[4845]: I1006 07:01:33.665807 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:01:33 crc kubenswrapper[4845]: E1006 07:01:33.723845 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading blob sha256:859785206f5760646f83984ae71a45c7ca6aa1029ff438169b0956e3f3a6ef4c: fetching blob: received unexpected HTTP status: 504 Gateway Timeout" image="registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48" Oct 06 07:01:33 crc kubenswrapper[4845]: E1006 07:01:33.724058 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gk9qf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(01f8105a-29c8-4fef-806a-e1ed266242bf): ErrImagePull: copying system image from manifest list: reading blob sha256:859785206f5760646f83984ae71a45c7ca6aa1029ff438169b0956e3f3a6ef4c: fetching blob: received unexpected HTTP status: 504 Gateway Timeout" logger="UnhandledError" Oct 06 07:01:33 crc kubenswrapper[4845]: E1006 07:01:33.725439 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"copying system image from manifest list: reading blob sha256:859785206f5760646f83984ae71a45c7ca6aa1029ff438169b0956e3f3a6ef4c: fetching blob: received unexpected HTTP status: 504 Gateway Timeout\"" pod="openstack/ceilometer-0" podUID="01f8105a-29c8-4fef-806a-e1ed266242bf" Oct 06 07:01:34 crc kubenswrapper[4845]: I1006 07:01:34.024257 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01f8105a-29c8-4fef-806a-e1ed266242bf" containerName="ceilometer-central-agent" containerID="cri-o://5696d56e23da5efa00acfd5b22eb0a7490b3ff5ce5290737fa0d9bed7d2af015" gracePeriod=30 Oct 06 07:01:34 crc kubenswrapper[4845]: I1006 07:01:34.024350 4845 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01f8105a-29c8-4fef-806a-e1ed266242bf" containerName="ceilometer-notification-agent" containerID="cri-o://56647ac1ce65e6636ec06d1d9263462cc93a281ac5686ad1707654b2dcab7ce4" gracePeriod=30 Oct 06 07:01:34 crc kubenswrapper[4845]: I1006 07:01:34.024339 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01f8105a-29c8-4fef-806a-e1ed266242bf" containerName="sg-core" containerID="cri-o://8a0e94c70de0df0ba83c97f7dbd701bb277d2194c426c5f900dbd4bcef3dfae1" gracePeriod=30 Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.035107 4845 generic.go:334] "Generic (PLEG): container finished" podID="01f8105a-29c8-4fef-806a-e1ed266242bf" containerID="8a0e94c70de0df0ba83c97f7dbd701bb277d2194c426c5f900dbd4bcef3dfae1" exitCode=2 Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.035476 4845 generic.go:334] "Generic (PLEG): container finished" podID="01f8105a-29c8-4fef-806a-e1ed266242bf" containerID="5696d56e23da5efa00acfd5b22eb0a7490b3ff5ce5290737fa0d9bed7d2af015" exitCode=0 Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.035208 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01f8105a-29c8-4fef-806a-e1ed266242bf","Type":"ContainerDied","Data":"8a0e94c70de0df0ba83c97f7dbd701bb277d2194c426c5f900dbd4bcef3dfae1"} Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.035558 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01f8105a-29c8-4fef-806a-e1ed266242bf","Type":"ContainerDied","Data":"5696d56e23da5efa00acfd5b22eb0a7490b3ff5ce5290737fa0d9bed7d2af015"} Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.038169 4845 generic.go:334] "Generic (PLEG): container finished" podID="eecc14ae-0a55-4b21-9ecf-07b6c122e050" containerID="f933d86f47f22e4a63f9e72dcd42428743559269c4e8cbd4ca7be74171922570" exitCode=0 Oct 06 
07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.038239 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wpzxr" event={"ID":"eecc14ae-0a55-4b21-9ecf-07b6c122e050","Type":"ContainerDied","Data":"f933d86f47f22e4a63f9e72dcd42428743559269c4e8cbd4ca7be74171922570"} Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.527389 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-767548595f-nndsw"] Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.529363 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.531733 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.531920 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.541153 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-767548595f-nndsw"] Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.622393 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-httpd-config\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.622540 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-config\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.622652 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-internal-tls-certs\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.622721 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-combined-ca-bundle\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.622761 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2gld\" (UniqueName: \"kubernetes.io/projected/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-kube-api-access-m2gld\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.622945 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-ovndb-tls-certs\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.623002 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-public-tls-certs\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: 
I1006 07:01:35.724940 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-httpd-config\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.725018 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-config\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.725051 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-internal-tls-certs\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.725075 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-combined-ca-bundle\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.725093 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2gld\" (UniqueName: \"kubernetes.io/projected/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-kube-api-access-m2gld\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.725122 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-ovndb-tls-certs\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.725138 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-public-tls-certs\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.731125 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-ovndb-tls-certs\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.731993 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-httpd-config\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.734160 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-combined-ca-bundle\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.737895 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-internal-tls-certs\") pod 
\"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.739867 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-config\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.740673 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2gld\" (UniqueName: \"kubernetes.io/projected/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-kube-api-access-m2gld\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.741182 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f554586f-3f7f-4fe0-9a1b-0ff75662c2e2-public-tls-certs\") pod \"neutron-767548595f-nndsw\" (UID: \"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2\") " pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:35 crc kubenswrapper[4845]: I1006 07:01:35.850635 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:39 crc kubenswrapper[4845]: E1006 07:01:39.623543 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01f8105a_29c8_4fef_806a_e1ed266242bf.slice/crio-conmon-56647ac1ce65e6636ec06d1d9263462cc93a281ac5686ad1707654b2dcab7ce4.scope\": RecentStats: unable to find data in memory cache]" Oct 06 07:01:40 crc kubenswrapper[4845]: I1006 07:01:40.084047 4845 generic.go:334] "Generic (PLEG): container finished" podID="01f8105a-29c8-4fef-806a-e1ed266242bf" containerID="56647ac1ce65e6636ec06d1d9263462cc93a281ac5686ad1707654b2dcab7ce4" exitCode=0 Oct 06 07:01:40 crc kubenswrapper[4845]: I1006 07:01:40.084134 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01f8105a-29c8-4fef-806a-e1ed266242bf","Type":"ContainerDied","Data":"56647ac1ce65e6636ec06d1d9263462cc93a281ac5686ad1707654b2dcab7ce4"} Oct 06 07:01:41 crc kubenswrapper[4845]: I1006 07:01:41.066663 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wpzxr" Oct 06 07:01:41 crc kubenswrapper[4845]: I1006 07:01:41.105924 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wpzxr" event={"ID":"eecc14ae-0a55-4b21-9ecf-07b6c122e050","Type":"ContainerDied","Data":"92d3ee3556335db20a55d8dcc0c0705b11615f52817e1770210545a64f53af98"} Oct 06 07:01:41 crc kubenswrapper[4845]: I1006 07:01:41.105966 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92d3ee3556335db20a55d8dcc0c0705b11615f52817e1770210545a64f53af98" Oct 06 07:01:41 crc kubenswrapper[4845]: I1006 07:01:41.106016 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wpzxr" Oct 06 07:01:41 crc kubenswrapper[4845]: I1006 07:01:41.255360 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7f5q\" (UniqueName: \"kubernetes.io/projected/eecc14ae-0a55-4b21-9ecf-07b6c122e050-kube-api-access-n7f5q\") pod \"eecc14ae-0a55-4b21-9ecf-07b6c122e050\" (UID: \"eecc14ae-0a55-4b21-9ecf-07b6c122e050\") " Oct 06 07:01:41 crc kubenswrapper[4845]: I1006 07:01:41.255507 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eecc14ae-0a55-4b21-9ecf-07b6c122e050-combined-ca-bundle\") pod \"eecc14ae-0a55-4b21-9ecf-07b6c122e050\" (UID: \"eecc14ae-0a55-4b21-9ecf-07b6c122e050\") " Oct 06 07:01:41 crc kubenswrapper[4845]: I1006 07:01:41.255663 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eecc14ae-0a55-4b21-9ecf-07b6c122e050-db-sync-config-data\") pod \"eecc14ae-0a55-4b21-9ecf-07b6c122e050\" (UID: \"eecc14ae-0a55-4b21-9ecf-07b6c122e050\") " Oct 06 07:01:41 crc kubenswrapper[4845]: I1006 07:01:41.270671 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eecc14ae-0a55-4b21-9ecf-07b6c122e050-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "eecc14ae-0a55-4b21-9ecf-07b6c122e050" (UID: "eecc14ae-0a55-4b21-9ecf-07b6c122e050"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:41 crc kubenswrapper[4845]: I1006 07:01:41.270870 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eecc14ae-0a55-4b21-9ecf-07b6c122e050-kube-api-access-n7f5q" (OuterVolumeSpecName: "kube-api-access-n7f5q") pod "eecc14ae-0a55-4b21-9ecf-07b6c122e050" (UID: "eecc14ae-0a55-4b21-9ecf-07b6c122e050"). 
InnerVolumeSpecName "kube-api-access-n7f5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:01:41 crc kubenswrapper[4845]: I1006 07:01:41.298107 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eecc14ae-0a55-4b21-9ecf-07b6c122e050-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eecc14ae-0a55-4b21-9ecf-07b6c122e050" (UID: "eecc14ae-0a55-4b21-9ecf-07b6c122e050"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:41 crc kubenswrapper[4845]: I1006 07:01:41.358355 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eecc14ae-0a55-4b21-9ecf-07b6c122e050-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:41 crc kubenswrapper[4845]: I1006 07:01:41.358414 4845 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eecc14ae-0a55-4b21-9ecf-07b6c122e050-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:41 crc kubenswrapper[4845]: I1006 07:01:41.358424 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7f5q\" (UniqueName: \"kubernetes.io/projected/eecc14ae-0a55-4b21-9ecf-07b6c122e050-kube-api-access-n7f5q\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:42 crc kubenswrapper[4845]: E1006 07:01:42.354509 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:b78cfc68a577b1553523c8a70a34e297" Oct 06 07:01:42 crc kubenswrapper[4845]: E1006 07:01:42.354850 4845 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:b78cfc68a577b1553523c8a70a34e297" Oct 06 07:01:42 crc 
kubenswrapper[4845]: E1006 07:01:42.354977 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:b78cfc68a577b1553523c8a70a34e297,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bcffr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/d
ev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-4wcpl_openstack(8c8a9f74-3441-4e68-8014-08472abf6680): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.355449 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7d85949b65-6bf6r"] Oct 06 07:01:42 crc kubenswrapper[4845]: E1006 07:01:42.355899 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eecc14ae-0a55-4b21-9ecf-07b6c122e050" containerName="barbican-db-sync" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.355918 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="eecc14ae-0a55-4b21-9ecf-07b6c122e050" containerName="barbican-db-sync" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.356141 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="eecc14ae-0a55-4b21-9ecf-07b6c122e050" containerName="barbican-db-sync" Oct 06 07:01:42 crc kubenswrapper[4845]: E1006 07:01:42.356544 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-4wcpl" podUID="8c8a9f74-3441-4e68-8014-08472abf6680" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.370681 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-c5976ff76-gcfvw"] 
Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.377111 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.381551 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7d85949b65-6bf6r" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.404270 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.404526 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.404628 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.404744 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nqrqx" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.418627 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7d85949b65-6bf6r"] Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.422703 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-c5976ff76-gcfvw"] Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.481657 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b247c0d-1911-47a7-83bc-fad6ee6d6172-combined-ca-bundle\") pod \"barbican-keystone-listener-c5976ff76-gcfvw\" (UID: \"3b247c0d-1911-47a7-83bc-fad6ee6d6172\") " pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.481744 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8cd370d-1327-4f32-a12d-e43c99f63f23-config-data\") pod \"barbican-worker-7d85949b65-6bf6r\" (UID: \"a8cd370d-1327-4f32-a12d-e43c99f63f23\") " pod="openstack/barbican-worker-7d85949b65-6bf6r" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.481774 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrzrn\" (UniqueName: \"kubernetes.io/projected/a8cd370d-1327-4f32-a12d-e43c99f63f23-kube-api-access-nrzrn\") pod \"barbican-worker-7d85949b65-6bf6r\" (UID: \"a8cd370d-1327-4f32-a12d-e43c99f63f23\") " pod="openstack/barbican-worker-7d85949b65-6bf6r" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.481806 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8cd370d-1327-4f32-a12d-e43c99f63f23-combined-ca-bundle\") pod \"barbican-worker-7d85949b65-6bf6r\" (UID: \"a8cd370d-1327-4f32-a12d-e43c99f63f23\") " pod="openstack/barbican-worker-7d85949b65-6bf6r" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.481863 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b247c0d-1911-47a7-83bc-fad6ee6d6172-config-data-custom\") pod \"barbican-keystone-listener-c5976ff76-gcfvw\" (UID: \"3b247c0d-1911-47a7-83bc-fad6ee6d6172\") " pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.481913 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b247c0d-1911-47a7-83bc-fad6ee6d6172-config-data\") pod \"barbican-keystone-listener-c5976ff76-gcfvw\" (UID: \"3b247c0d-1911-47a7-83bc-fad6ee6d6172\") " 
pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.481932 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdk87\" (UniqueName: \"kubernetes.io/projected/3b247c0d-1911-47a7-83bc-fad6ee6d6172-kube-api-access-kdk87\") pod \"barbican-keystone-listener-c5976ff76-gcfvw\" (UID: \"3b247c0d-1911-47a7-83bc-fad6ee6d6172\") " pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.481979 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b247c0d-1911-47a7-83bc-fad6ee6d6172-logs\") pod \"barbican-keystone-listener-c5976ff76-gcfvw\" (UID: \"3b247c0d-1911-47a7-83bc-fad6ee6d6172\") " pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.482008 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8cd370d-1327-4f32-a12d-e43c99f63f23-logs\") pod \"barbican-worker-7d85949b65-6bf6r\" (UID: \"a8cd370d-1327-4f32-a12d-e43c99f63f23\") " pod="openstack/barbican-worker-7d85949b65-6bf6r" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.482034 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8cd370d-1327-4f32-a12d-e43c99f63f23-config-data-custom\") pod \"barbican-worker-7d85949b65-6bf6r\" (UID: \"a8cd370d-1327-4f32-a12d-e43c99f63f23\") " pod="openstack/barbican-worker-7d85949b65-6bf6r" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.536595 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79f8b69b5-x65p4"] Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.595183 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b247c0d-1911-47a7-83bc-fad6ee6d6172-config-data\") pod \"barbican-keystone-listener-c5976ff76-gcfvw\" (UID: \"3b247c0d-1911-47a7-83bc-fad6ee6d6172\") " pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.595577 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdk87\" (UniqueName: \"kubernetes.io/projected/3b247c0d-1911-47a7-83bc-fad6ee6d6172-kube-api-access-kdk87\") pod \"barbican-keystone-listener-c5976ff76-gcfvw\" (UID: \"3b247c0d-1911-47a7-83bc-fad6ee6d6172\") " pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.595613 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b247c0d-1911-47a7-83bc-fad6ee6d6172-logs\") pod \"barbican-keystone-listener-c5976ff76-gcfvw\" (UID: \"3b247c0d-1911-47a7-83bc-fad6ee6d6172\") " pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.595638 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8cd370d-1327-4f32-a12d-e43c99f63f23-logs\") pod \"barbican-worker-7d85949b65-6bf6r\" (UID: \"a8cd370d-1327-4f32-a12d-e43c99f63f23\") " pod="openstack/barbican-worker-7d85949b65-6bf6r" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.595666 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8cd370d-1327-4f32-a12d-e43c99f63f23-config-data-custom\") pod \"barbican-worker-7d85949b65-6bf6r\" (UID: \"a8cd370d-1327-4f32-a12d-e43c99f63f23\") " pod="openstack/barbican-worker-7d85949b65-6bf6r" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.595688 4845 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b247c0d-1911-47a7-83bc-fad6ee6d6172-combined-ca-bundle\") pod \"barbican-keystone-listener-c5976ff76-gcfvw\" (UID: \"3b247c0d-1911-47a7-83bc-fad6ee6d6172\") " pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.595740 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8cd370d-1327-4f32-a12d-e43c99f63f23-config-data\") pod \"barbican-worker-7d85949b65-6bf6r\" (UID: \"a8cd370d-1327-4f32-a12d-e43c99f63f23\") " pod="openstack/barbican-worker-7d85949b65-6bf6r" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.595761 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrzrn\" (UniqueName: \"kubernetes.io/projected/a8cd370d-1327-4f32-a12d-e43c99f63f23-kube-api-access-nrzrn\") pod \"barbican-worker-7d85949b65-6bf6r\" (UID: \"a8cd370d-1327-4f32-a12d-e43c99f63f23\") " pod="openstack/barbican-worker-7d85949b65-6bf6r" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.595791 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8cd370d-1327-4f32-a12d-e43c99f63f23-combined-ca-bundle\") pod \"barbican-worker-7d85949b65-6bf6r\" (UID: \"a8cd370d-1327-4f32-a12d-e43c99f63f23\") " pod="openstack/barbican-worker-7d85949b65-6bf6r" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.595812 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b247c0d-1911-47a7-83bc-fad6ee6d6172-config-data-custom\") pod \"barbican-keystone-listener-c5976ff76-gcfvw\" (UID: \"3b247c0d-1911-47a7-83bc-fad6ee6d6172\") " pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" Oct 06 07:01:42 crc 
kubenswrapper[4845]: I1006 07:01:42.597685 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8cd370d-1327-4f32-a12d-e43c99f63f23-logs\") pod \"barbican-worker-7d85949b65-6bf6r\" (UID: \"a8cd370d-1327-4f32-a12d-e43c99f63f23\") " pod="openstack/barbican-worker-7d85949b65-6bf6r" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.622872 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b247c0d-1911-47a7-83bc-fad6ee6d6172-logs\") pod \"barbican-keystone-listener-c5976ff76-gcfvw\" (UID: \"3b247c0d-1911-47a7-83bc-fad6ee6d6172\") " pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.666462 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d85d699f7-d86dl"] Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.668296 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.674429 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d85d699f7-d86dl"] Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.700105 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8cd370d-1327-4f32-a12d-e43c99f63f23-config-data\") pod \"barbican-worker-7d85949b65-6bf6r\" (UID: \"a8cd370d-1327-4f32-a12d-e43c99f63f23\") " pod="openstack/barbican-worker-7d85949b65-6bf6r" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.708207 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6c7499b5fd-mcb8g"] Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.709667 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.716677 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdk87\" (UniqueName: \"kubernetes.io/projected/3b247c0d-1911-47a7-83bc-fad6ee6d6172-kube-api-access-kdk87\") pod \"barbican-keystone-listener-c5976ff76-gcfvw\" (UID: \"3b247c0d-1911-47a7-83bc-fad6ee6d6172\") " pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.717188 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.719738 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8cd370d-1327-4f32-a12d-e43c99f63f23-combined-ca-bundle\") pod \"barbican-worker-7d85949b65-6bf6r\" (UID: \"a8cd370d-1327-4f32-a12d-e43c99f63f23\") " pod="openstack/barbican-worker-7d85949b65-6bf6r" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.719770 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8cd370d-1327-4f32-a12d-e43c99f63f23-config-data-custom\") pod \"barbican-worker-7d85949b65-6bf6r\" (UID: \"a8cd370d-1327-4f32-a12d-e43c99f63f23\") " pod="openstack/barbican-worker-7d85949b65-6bf6r" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.724697 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b247c0d-1911-47a7-83bc-fad6ee6d6172-config-data\") pod \"barbican-keystone-listener-c5976ff76-gcfvw\" (UID: \"3b247c0d-1911-47a7-83bc-fad6ee6d6172\") " pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.726108 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b247c0d-1911-47a7-83bc-fad6ee6d6172-config-data-custom\") pod \"barbican-keystone-listener-c5976ff76-gcfvw\" (UID: \"3b247c0d-1911-47a7-83bc-fad6ee6d6172\") " pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.726462 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b247c0d-1911-47a7-83bc-fad6ee6d6172-combined-ca-bundle\") pod \"barbican-keystone-listener-c5976ff76-gcfvw\" (UID: \"3b247c0d-1911-47a7-83bc-fad6ee6d6172\") " pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.729820 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrzrn\" (UniqueName: \"kubernetes.io/projected/a8cd370d-1327-4f32-a12d-e43c99f63f23-kube-api-access-nrzrn\") pod \"barbican-worker-7d85949b65-6bf6r\" (UID: \"a8cd370d-1327-4f32-a12d-e43c99f63f23\") " pod="openstack/barbican-worker-7d85949b65-6bf6r" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.744327 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6c7499b5fd-mcb8g"] Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.802791 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-ovsdbserver-nb\") pod \"dnsmasq-dns-7d85d699f7-d86dl\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.803674 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-config\") pod \"dnsmasq-dns-7d85d699f7-d86dl\" (UID: 
\"3c0f4b65-1983-440e-a63a-e239797757eb\") " pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.803804 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-ovsdbserver-sb\") pod \"dnsmasq-dns-7d85d699f7-d86dl\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.803905 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55dnh\" (UniqueName: \"kubernetes.io/projected/3c0f4b65-1983-440e-a63a-e239797757eb-kube-api-access-55dnh\") pod \"dnsmasq-dns-7d85d699f7-d86dl\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.804017 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-dns-svc\") pod \"dnsmasq-dns-7d85d699f7-d86dl\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.804134 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-dns-swift-storage-0\") pod \"dnsmasq-dns-7d85d699f7-d86dl\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.807878 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.817723 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7d85949b65-6bf6r" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.905715 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/622dd478-18fc-4c49-8019-1f12bf96e43e-logs\") pod \"barbican-api-6c7499b5fd-mcb8g\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") " pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.905945 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-dns-swift-storage-0\") pod \"dnsmasq-dns-7d85d699f7-d86dl\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.906059 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/622dd478-18fc-4c49-8019-1f12bf96e43e-config-data-custom\") pod \"barbican-api-6c7499b5fd-mcb8g\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") " pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.906193 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-ovsdbserver-nb\") pod \"dnsmasq-dns-7d85d699f7-d86dl\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.906302 4845 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-config\") pod \"dnsmasq-dns-7d85d699f7-d86dl\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.906503 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/622dd478-18fc-4c49-8019-1f12bf96e43e-config-data\") pod \"barbican-api-6c7499b5fd-mcb8g\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") " pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.907304 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-dns-swift-storage-0\") pod \"dnsmasq-dns-7d85d699f7-d86dl\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.907297 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-config\") pod \"dnsmasq-dns-7d85d699f7-d86dl\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.907514 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-ovsdbserver-sb\") pod \"dnsmasq-dns-7d85d699f7-d86dl\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.907600 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55dnh\" (UniqueName: 
\"kubernetes.io/projected/3c0f4b65-1983-440e-a63a-e239797757eb-kube-api-access-55dnh\") pod \"dnsmasq-dns-7d85d699f7-d86dl\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.907670 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk68g\" (UniqueName: \"kubernetes.io/projected/622dd478-18fc-4c49-8019-1f12bf96e43e-kube-api-access-wk68g\") pod \"barbican-api-6c7499b5fd-mcb8g\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") " pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.907747 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/622dd478-18fc-4c49-8019-1f12bf96e43e-combined-ca-bundle\") pod \"barbican-api-6c7499b5fd-mcb8g\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") " pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.907831 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-dns-svc\") pod \"dnsmasq-dns-7d85d699f7-d86dl\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.908315 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-ovsdbserver-nb\") pod \"dnsmasq-dns-7d85d699f7-d86dl\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.908621 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-dns-svc\") pod \"dnsmasq-dns-7d85d699f7-d86dl\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.909176 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-ovsdbserver-sb\") pod \"dnsmasq-dns-7d85d699f7-d86dl\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:42 crc kubenswrapper[4845]: I1006 07:01:42.930844 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55dnh\" (UniqueName: \"kubernetes.io/projected/3c0f4b65-1983-440e-a63a-e239797757eb-kube-api-access-55dnh\") pod \"dnsmasq-dns-7d85d699f7-d86dl\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.009275 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/622dd478-18fc-4c49-8019-1f12bf96e43e-config-data-custom\") pod \"barbican-api-6c7499b5fd-mcb8g\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") " pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.009351 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/622dd478-18fc-4c49-8019-1f12bf96e43e-config-data\") pod \"barbican-api-6c7499b5fd-mcb8g\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") " pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.009493 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk68g\" (UniqueName: 
\"kubernetes.io/projected/622dd478-18fc-4c49-8019-1f12bf96e43e-kube-api-access-wk68g\") pod \"barbican-api-6c7499b5fd-mcb8g\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") " pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.009515 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/622dd478-18fc-4c49-8019-1f12bf96e43e-combined-ca-bundle\") pod \"barbican-api-6c7499b5fd-mcb8g\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") " pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.009586 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/622dd478-18fc-4c49-8019-1f12bf96e43e-logs\") pod \"barbican-api-6c7499b5fd-mcb8g\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") " pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.009990 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/622dd478-18fc-4c49-8019-1f12bf96e43e-logs\") pod \"barbican-api-6c7499b5fd-mcb8g\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") " pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.013208 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/622dd478-18fc-4c49-8019-1f12bf96e43e-config-data-custom\") pod \"barbican-api-6c7499b5fd-mcb8g\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") " pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.018509 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/622dd478-18fc-4c49-8019-1f12bf96e43e-combined-ca-bundle\") pod 
\"barbican-api-6c7499b5fd-mcb8g\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") " pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.018899 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/622dd478-18fc-4c49-8019-1f12bf96e43e-config-data\") pod \"barbican-api-6c7499b5fd-mcb8g\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") " pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.026292 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk68g\" (UniqueName: \"kubernetes.io/projected/622dd478-18fc-4c49-8019-1f12bf96e43e-kube-api-access-wk68g\") pod \"barbican-api-6c7499b5fd-mcb8g\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") " pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.095261 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.105633 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.110082 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-config-data\") pod \"01f8105a-29c8-4fef-806a-e1ed266242bf\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.110543 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01f8105a-29c8-4fef-806a-e1ed266242bf-log-httpd\") pod \"01f8105a-29c8-4fef-806a-e1ed266242bf\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.110667 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-combined-ca-bundle\") pod \"01f8105a-29c8-4fef-806a-e1ed266242bf\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.110745 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk9qf\" (UniqueName: \"kubernetes.io/projected/01f8105a-29c8-4fef-806a-e1ed266242bf-kube-api-access-gk9qf\") pod \"01f8105a-29c8-4fef-806a-e1ed266242bf\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.110830 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-scripts\") pod \"01f8105a-29c8-4fef-806a-e1ed266242bf\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.110907 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/01f8105a-29c8-4fef-806a-e1ed266242bf-run-httpd\") pod \"01f8105a-29c8-4fef-806a-e1ed266242bf\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.111365 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-sg-core-conf-yaml\") pod \"01f8105a-29c8-4fef-806a-e1ed266242bf\" (UID: \"01f8105a-29c8-4fef-806a-e1ed266242bf\") " Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.110857 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01f8105a-29c8-4fef-806a-e1ed266242bf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "01f8105a-29c8-4fef-806a-e1ed266242bf" (UID: "01f8105a-29c8-4fef-806a-e1ed266242bf"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.114666 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01f8105a-29c8-4fef-806a-e1ed266242bf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "01f8105a-29c8-4fef-806a-e1ed266242bf" (UID: "01f8105a-29c8-4fef-806a-e1ed266242bf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.117206 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-scripts" (OuterVolumeSpecName: "scripts") pod "01f8105a-29c8-4fef-806a-e1ed266242bf" (UID: "01f8105a-29c8-4fef-806a-e1ed266242bf"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.117288 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01f8105a-29c8-4fef-806a-e1ed266242bf-kube-api-access-gk9qf" (OuterVolumeSpecName: "kube-api-access-gk9qf") pod "01f8105a-29c8-4fef-806a-e1ed266242bf" (UID: "01f8105a-29c8-4fef-806a-e1ed266242bf"). InnerVolumeSpecName "kube-api-access-gk9qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.122110 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.138576 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.138701 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01f8105a-29c8-4fef-806a-e1ed266242bf","Type":"ContainerDied","Data":"cea49f75029dbf1db75154735fd9c6cecbd49c5ad37eaf733c727c2552dd75fc"} Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.138744 4845 scope.go:117] "RemoveContainer" containerID="8a0e94c70de0df0ba83c97f7dbd701bb277d2194c426c5f900dbd4bcef3dfae1" Oct 06 07:01:43 crc kubenswrapper[4845]: E1006 07:01:43.161065 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:b78cfc68a577b1553523c8a70a34e297\\\"\"" pod="openstack/cinder-db-sync-4wcpl" podUID="8c8a9f74-3441-4e68-8014-08472abf6680" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.161673 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-sg-core-conf-yaml" 
(OuterVolumeSpecName: "sg-core-conf-yaml") pod "01f8105a-29c8-4fef-806a-e1ed266242bf" (UID: "01f8105a-29c8-4fef-806a-e1ed266242bf"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.175420 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-config-data" (OuterVolumeSpecName: "config-data") pod "01f8105a-29c8-4fef-806a-e1ed266242bf" (UID: "01f8105a-29c8-4fef-806a-e1ed266242bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.181253 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01f8105a-29c8-4fef-806a-e1ed266242bf" (UID: "01f8105a-29c8-4fef-806a-e1ed266242bf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.214334 4845 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.214594 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.214797 4845 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01f8105a-29c8-4fef-806a-e1ed266242bf-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.214993 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.215092 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01f8105a-29c8-4fef-806a-e1ed266242bf-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.215130 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk9qf\" (UniqueName: \"kubernetes.io/projected/01f8105a-29c8-4fef-806a-e1ed266242bf-kube-api-access-gk9qf\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.215498 4845 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01f8105a-29c8-4fef-806a-e1ed266242bf-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.224428 4845 scope.go:117] 
"RemoveContainer" containerID="56647ac1ce65e6636ec06d1d9263462cc93a281ac5686ad1707654b2dcab7ce4" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.264034 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79f8b69b5-x65p4"] Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.280242 4845 scope.go:117] "RemoveContainer" containerID="5696d56e23da5efa00acfd5b22eb0a7490b3ff5ce5290737fa0d9bed7d2af015" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.488873 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-c5976ff76-gcfvw"] Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.501442 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7d85949b65-6bf6r"] Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.545540 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.560324 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.572327 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:01:43 crc kubenswrapper[4845]: E1006 07:01:43.572770 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f8105a-29c8-4fef-806a-e1ed266242bf" containerName="sg-core" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.572788 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f8105a-29c8-4fef-806a-e1ed266242bf" containerName="sg-core" Oct 06 07:01:43 crc kubenswrapper[4845]: E1006 07:01:43.572810 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f8105a-29c8-4fef-806a-e1ed266242bf" containerName="ceilometer-notification-agent" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.572816 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f8105a-29c8-4fef-806a-e1ed266242bf" 
containerName="ceilometer-notification-agent" Oct 06 07:01:43 crc kubenswrapper[4845]: E1006 07:01:43.572826 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f8105a-29c8-4fef-806a-e1ed266242bf" containerName="ceilometer-central-agent" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.572842 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f8105a-29c8-4fef-806a-e1ed266242bf" containerName="ceilometer-central-agent" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.573007 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f8105a-29c8-4fef-806a-e1ed266242bf" containerName="ceilometer-notification-agent" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.573022 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f8105a-29c8-4fef-806a-e1ed266242bf" containerName="sg-core" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.573033 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f8105a-29c8-4fef-806a-e1ed266242bf" containerName="ceilometer-central-agent" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.574664 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.577671 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.578052 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 07:01:43 crc kubenswrapper[4845]: W1006 07:01:43.595565 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c0f4b65_1983_440e_a63a_e239797757eb.slice/crio-52b9b4743bd4b7d4d44609d4380aa46830c70d7bcee93c15a48916b951e28be1 WatchSource:0}: Error finding container 52b9b4743bd4b7d4d44609d4380aa46830c70d7bcee93c15a48916b951e28be1: Status 404 returned error can't find the container with id 52b9b4743bd4b7d4d44609d4380aa46830c70d7bcee93c15a48916b951e28be1 Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.602878 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.615029 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d85d699f7-d86dl"] Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.652295 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6c7499b5fd-mcb8g"] Oct 06 07:01:43 crc kubenswrapper[4845]: W1006 07:01:43.655009 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod622dd478_18fc_4c49_8019_1f12bf96e43e.slice/crio-85aa4976cba23eaa79a14797be4a76d4c4a64e73268e0a04803af4b1c7f8eb3c WatchSource:0}: Error finding container 85aa4976cba23eaa79a14797be4a76d4c4a64e73268e0a04803af4b1c7f8eb3c: Status 404 returned error can't find the container with id 85aa4976cba23eaa79a14797be4a76d4c4a64e73268e0a04803af4b1c7f8eb3c Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 
07:01:43.724806 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5574b86b4d-dzxdt"] Oct 06 07:01:43 crc kubenswrapper[4845]: W1006 07:01:43.729150 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e6e1657_fd8e_4dd4_b3be_166870ab020a.slice/crio-8161852c4b9a44a55dfb35197ca62721baa0d33100a3c692bc5fad6557a7a68a WatchSource:0}: Error finding container 8161852c4b9a44a55dfb35197ca62721baa0d33100a3c692bc5fad6557a7a68a: Status 404 returned error can't find the container with id 8161852c4b9a44a55dfb35197ca62721baa0d33100a3c692bc5fad6557a7a68a Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.729618 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-scripts\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.729690 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17a717dd-d883-477e-9894-653d2281a966-log-httpd\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.729783 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-config-data\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.729809 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.729830 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17a717dd-d883-477e-9894-653d2281a966-run-httpd\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.729855 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.729893 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l6x7\" (UniqueName: \"kubernetes.io/projected/17a717dd-d883-477e-9894-653d2281a966-kube-api-access-9l6x7\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.831645 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-scripts\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.831730 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17a717dd-d883-477e-9894-653d2281a966-log-httpd\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " 
pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.831828 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-config-data\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.831856 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.831894 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17a717dd-d883-477e-9894-653d2281a966-run-httpd\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.831920 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.832018 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l6x7\" (UniqueName: \"kubernetes.io/projected/17a717dd-d883-477e-9894-653d2281a966-kube-api-access-9l6x7\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.833045 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/17a717dd-d883-477e-9894-653d2281a966-run-httpd\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.833120 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17a717dd-d883-477e-9894-653d2281a966-log-httpd\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.837328 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-scripts\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.837907 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.838640 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-config-data\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.839909 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.866479 4845 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9l6x7\" (UniqueName: \"kubernetes.io/projected/17a717dd-d883-477e-9894-653d2281a966-kube-api-access-9l6x7\") pod \"ceilometer-0\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " pod="openstack/ceilometer-0" Oct 06 07:01:43 crc kubenswrapper[4845]: I1006 07:01:43.897428 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.174815 4845 generic.go:334] "Generic (PLEG): container finished" podID="2e1193ce-a230-46ae-bef2-747405888b97" containerID="5abb18616e837182327d67e0fbf7bfd9b7204b330c2687dfd13dd6ee394ca93d" exitCode=0 Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.174976 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" event={"ID":"2e1193ce-a230-46ae-bef2-747405888b97","Type":"ContainerDied","Data":"5abb18616e837182327d67e0fbf7bfd9b7204b330c2687dfd13dd6ee394ca93d"} Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.175112 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" event={"ID":"2e1193ce-a230-46ae-bef2-747405888b97","Type":"ContainerStarted","Data":"cc9f2246e54e52d078d95db930bf12e7244c84254e449dba63d51de3fbc48783"} Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.178091 4845 generic.go:334] "Generic (PLEG): container finished" podID="3c0f4b65-1983-440e-a63a-e239797757eb" containerID="2c122a44832cbca574968c775582e03d71a3d4ec67cf3ac007ec2d7e1fe56546" exitCode=0 Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.178224 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" event={"ID":"3c0f4b65-1983-440e-a63a-e239797757eb","Type":"ContainerDied","Data":"2c122a44832cbca574968c775582e03d71a3d4ec67cf3ac007ec2d7e1fe56546"} Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.178255 4845 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" event={"ID":"3c0f4b65-1983-440e-a63a-e239797757eb","Type":"ContainerStarted","Data":"52b9b4743bd4b7d4d44609d4380aa46830c70d7bcee93c15a48916b951e28be1"} Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.181847 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5574b86b4d-dzxdt" event={"ID":"7e6e1657-fd8e-4dd4-b3be-166870ab020a","Type":"ContainerStarted","Data":"04d5afdbea5e58792aa4d2401c5f7eb9f446eb9bc74726a628bd7d0469805e10"} Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.181893 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5574b86b4d-dzxdt" event={"ID":"7e6e1657-fd8e-4dd4-b3be-166870ab020a","Type":"ContainerStarted","Data":"8161852c4b9a44a55dfb35197ca62721baa0d33100a3c692bc5fad6557a7a68a"} Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.186221 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" event={"ID":"3b247c0d-1911-47a7-83bc-fad6ee6d6172","Type":"ContainerStarted","Data":"406bcf5ad98cc83022b8a7dad372351168c573477b0655417f51cc229742897c"} Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.193629 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c7499b5fd-mcb8g" event={"ID":"622dd478-18fc-4c49-8019-1f12bf96e43e","Type":"ContainerStarted","Data":"9ca3af92d2dfa47212e94289cb6b863e33b151e789936337d18646a478ee96c8"} Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.193676 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c7499b5fd-mcb8g" event={"ID":"622dd478-18fc-4c49-8019-1f12bf96e43e","Type":"ContainerStarted","Data":"85aa4976cba23eaa79a14797be4a76d4c4a64e73268e0a04803af4b1c7f8eb3c"} Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.195028 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d85949b65-6bf6r" 
event={"ID":"a8cd370d-1327-4f32-a12d-e43c99f63f23","Type":"ContainerStarted","Data":"016de379e4aea54a294aed2489927f49a093853995603dc5c1c2dabfb1fc48c0"} Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.252904 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01f8105a-29c8-4fef-806a-e1ed266242bf" path="/var/lib/kubelet/pods/01f8105a-29c8-4fef-806a-e1ed266242bf/volumes" Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.350014 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.458651 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-767548595f-nndsw"] Oct 06 07:01:44 crc kubenswrapper[4845]: W1006 07:01:44.782611 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17a717dd_d883_477e_9894_653d2281a966.slice/crio-8d1aed851da3395288e193f555f233b1ee279c4059f6c16e428b407ada6c794f WatchSource:0}: Error finding container 8d1aed851da3395288e193f555f233b1ee279c4059f6c16e428b407ada6c794f: Status 404 returned error can't find the container with id 8d1aed851da3395288e193f555f233b1ee279c4059f6c16e428b407ada6c794f Oct 06 07:01:44 crc kubenswrapper[4845]: W1006 07:01:44.786920 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf554586f_3f7f_4fe0_9a1b_0ff75662c2e2.slice/crio-cd680ea3d5b502c9c0240647cb640b6f64b62ddd952eb2d4674765094a25a12f WatchSource:0}: Error finding container cd680ea3d5b502c9c0240647cb640b6f64b62ddd952eb2d4674765094a25a12f: Status 404 returned error can't find the container with id cd680ea3d5b502c9c0240647cb640b6f64b62ddd952eb2d4674765094a25a12f Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.821972 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.860270 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-dns-svc\") pod \"2e1193ce-a230-46ae-bef2-747405888b97\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.860328 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-ovsdbserver-nb\") pod \"2e1193ce-a230-46ae-bef2-747405888b97\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.860593 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgmcr\" (UniqueName: \"kubernetes.io/projected/2e1193ce-a230-46ae-bef2-747405888b97-kube-api-access-xgmcr\") pod \"2e1193ce-a230-46ae-bef2-747405888b97\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.860679 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-config\") pod \"2e1193ce-a230-46ae-bef2-747405888b97\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.860715 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-dns-swift-storage-0\") pod \"2e1193ce-a230-46ae-bef2-747405888b97\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.860748 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-ovsdbserver-sb\") pod \"2e1193ce-a230-46ae-bef2-747405888b97\" (UID: \"2e1193ce-a230-46ae-bef2-747405888b97\") " Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.867758 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e1193ce-a230-46ae-bef2-747405888b97-kube-api-access-xgmcr" (OuterVolumeSpecName: "kube-api-access-xgmcr") pod "2e1193ce-a230-46ae-bef2-747405888b97" (UID: "2e1193ce-a230-46ae-bef2-747405888b97"). InnerVolumeSpecName "kube-api-access-xgmcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.909893 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2e1193ce-a230-46ae-bef2-747405888b97" (UID: "2e1193ce-a230-46ae-bef2-747405888b97"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.910501 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2e1193ce-a230-46ae-bef2-747405888b97" (UID: "2e1193ce-a230-46ae-bef2-747405888b97"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.911574 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2e1193ce-a230-46ae-bef2-747405888b97" (UID: "2e1193ce-a230-46ae-bef2-747405888b97"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.916777 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2e1193ce-a230-46ae-bef2-747405888b97" (UID: "2e1193ce-a230-46ae-bef2-747405888b97"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.919064 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-config" (OuterVolumeSpecName: "config") pod "2e1193ce-a230-46ae-bef2-747405888b97" (UID: "2e1193ce-a230-46ae-bef2-747405888b97"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.962391 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgmcr\" (UniqueName: \"kubernetes.io/projected/2e1193ce-a230-46ae-bef2-747405888b97-kube-api-access-xgmcr\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.962418 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-config\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.962428 4845 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.962436 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 
06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.962444 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:44 crc kubenswrapper[4845]: I1006 07:01:44.962487 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e1193ce-a230-46ae-bef2-747405888b97-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 07:01:45 crc kubenswrapper[4845]: I1006 07:01:45.206072 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-767548595f-nndsw" event={"ID":"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2","Type":"ContainerStarted","Data":"cd680ea3d5b502c9c0240647cb640b6f64b62ddd952eb2d4674765094a25a12f"} Oct 06 07:01:45 crc kubenswrapper[4845]: I1006 07:01:45.208215 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" Oct 06 07:01:45 crc kubenswrapper[4845]: I1006 07:01:45.208211 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f8b69b5-x65p4" event={"ID":"2e1193ce-a230-46ae-bef2-747405888b97","Type":"ContainerDied","Data":"cc9f2246e54e52d078d95db930bf12e7244c84254e449dba63d51de3fbc48783"} Oct 06 07:01:45 crc kubenswrapper[4845]: I1006 07:01:45.208267 4845 scope.go:117] "RemoveContainer" containerID="5abb18616e837182327d67e0fbf7bfd9b7204b330c2687dfd13dd6ee394ca93d" Oct 06 07:01:45 crc kubenswrapper[4845]: I1006 07:01:45.225230 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" event={"ID":"3c0f4b65-1983-440e-a63a-e239797757eb","Type":"ContainerStarted","Data":"7593053bad6c2906ce130b20b159c868b10c99ebd98ed1e464fcc5beb99d069a"} Oct 06 07:01:45 crc kubenswrapper[4845]: I1006 07:01:45.226820 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" 
Oct 06 07:01:45 crc kubenswrapper[4845]: I1006 07:01:45.230418 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5574b86b4d-dzxdt" event={"ID":"7e6e1657-fd8e-4dd4-b3be-166870ab020a","Type":"ContainerStarted","Data":"9b9093e8cd2362231534c1a487b1e59a14405f7b8e314cccfb65cf7cebcbc1ce"} Oct 06 07:01:45 crc kubenswrapper[4845]: I1006 07:01:45.230627 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:01:45 crc kubenswrapper[4845]: I1006 07:01:45.232498 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17a717dd-d883-477e-9894-653d2281a966","Type":"ContainerStarted","Data":"8d1aed851da3395288e193f555f233b1ee279c4059f6c16e428b407ada6c794f"} Oct 06 07:01:45 crc kubenswrapper[4845]: I1006 07:01:45.238959 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c7499b5fd-mcb8g" event={"ID":"622dd478-18fc-4c49-8019-1f12bf96e43e","Type":"ContainerStarted","Data":"a58a930bec4a730dd14092fefca2ff5a8d2eeca575bdc188173e0cccceab34c3"} Oct 06 07:01:45 crc kubenswrapper[4845]: I1006 07:01:45.239211 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:45 crc kubenswrapper[4845]: I1006 07:01:45.239232 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:45 crc kubenswrapper[4845]: I1006 07:01:45.281009 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" podStartSLOduration=3.280979317 podStartE2EDuration="3.280979317s" podCreationTimestamp="2025-10-06 07:01:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:01:45.249706083 +0000 UTC m=+989.764447081" watchObservedRunningTime="2025-10-06 07:01:45.280979317 
+0000 UTC m=+989.795720345" Oct 06 07:01:45 crc kubenswrapper[4845]: I1006 07:01:45.292262 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5574b86b4d-dzxdt" podStartSLOduration=12.292241086 podStartE2EDuration="12.292241086s" podCreationTimestamp="2025-10-06 07:01:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:01:45.270491477 +0000 UTC m=+989.785232495" watchObservedRunningTime="2025-10-06 07:01:45.292241086 +0000 UTC m=+989.806982104" Oct 06 07:01:45 crc kubenswrapper[4845]: I1006 07:01:45.323384 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6c7499b5fd-mcb8g" podStartSLOduration=3.323349386 podStartE2EDuration="3.323349386s" podCreationTimestamp="2025-10-06 07:01:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:01:45.29199631 +0000 UTC m=+989.806737318" watchObservedRunningTime="2025-10-06 07:01:45.323349386 +0000 UTC m=+989.838090404" Oct 06 07:01:45 crc kubenswrapper[4845]: I1006 07:01:45.421858 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79f8b69b5-x65p4"] Oct 06 07:01:45 crc kubenswrapper[4845]: I1006 07:01:45.435736 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79f8b69b5-x65p4"] Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.239575 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e1193ce-a230-46ae-bef2-747405888b97" path="/var/lib/kubelet/pods/2e1193ce-a230-46ae-bef2-747405888b97/volumes" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.253734 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"17a717dd-d883-477e-9894-653d2281a966","Type":"ContainerStarted","Data":"e0ab01dd95166993806b5271a47be68edd03b7ed0e90cb14a391ab1788827a98"} Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.253780 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17a717dd-d883-477e-9894-653d2281a966","Type":"ContainerStarted","Data":"2167885e23ddb5802aae3cde92cf7e5c5b9a70e5913c3ca2fd4df7d58d433565"} Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.262264 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" event={"ID":"3b247c0d-1911-47a7-83bc-fad6ee6d6172","Type":"ContainerStarted","Data":"63596806e760815d5b702c9636999ba266c73f6bdb29b493abd93069c7d40b93"} Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.262312 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" event={"ID":"3b247c0d-1911-47a7-83bc-fad6ee6d6172","Type":"ContainerStarted","Data":"24979e228929e704d2fa8de35d0e418d6693e72051d3ecddf5857429dc6ab24b"} Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.264606 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-767548595f-nndsw" event={"ID":"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2","Type":"ContainerStarted","Data":"c1388cebe0628e99fe11cba03298cfc215e876e6faf768a164de0292c1fa513d"} Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.264632 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-767548595f-nndsw" event={"ID":"f554586f-3f7f-4fe0-9a1b-0ff75662c2e2","Type":"ContainerStarted","Data":"59c0686720aea42e4b726b818a7caede14c97990b9156de12ee3029dc1e63cb8"} Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.264744 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-767548595f-nndsw" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.266982 4845 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/barbican-worker-7d85949b65-6bf6r" event={"ID":"a8cd370d-1327-4f32-a12d-e43c99f63f23","Type":"ContainerStarted","Data":"b36ac52dfda60f3429b2cfb7f41cf183b635dd4fa5ee3b40ec4c32854bcda849"} Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.267019 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d85949b65-6bf6r" event={"ID":"a8cd370d-1327-4f32-a12d-e43c99f63f23","Type":"ContainerStarted","Data":"fe7f773840c976d618c8affc612505f3a02bbb8125241fca20f267c07aec7ec1"} Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.346932 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-c5976ff76-gcfvw" podStartSLOduration=2.565456219 podStartE2EDuration="4.346917595s" podCreationTimestamp="2025-10-06 07:01:42 +0000 UTC" firstStartedPulling="2025-10-06 07:01:43.482316418 +0000 UTC m=+987.997057426" lastFinishedPulling="2025-10-06 07:01:45.263777784 +0000 UTC m=+989.778518802" observedRunningTime="2025-10-06 07:01:46.343845496 +0000 UTC m=+990.858586504" watchObservedRunningTime="2025-10-06 07:01:46.346917595 +0000 UTC m=+990.861658603" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.396186 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7d85949b65-6bf6r" podStartSLOduration=2.6126894529999998 podStartE2EDuration="4.396167221s" podCreationTimestamp="2025-10-06 07:01:42 +0000 UTC" firstStartedPulling="2025-10-06 07:01:43.483644922 +0000 UTC m=+987.998385930" lastFinishedPulling="2025-10-06 07:01:45.26712269 +0000 UTC m=+989.781863698" observedRunningTime="2025-10-06 07:01:46.39456086 +0000 UTC m=+990.909301868" watchObservedRunningTime="2025-10-06 07:01:46.396167221 +0000 UTC m=+990.910908229" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.399911 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-767548595f-nndsw" 
podStartSLOduration=11.399899867 podStartE2EDuration="11.399899867s" podCreationTimestamp="2025-10-06 07:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:01:46.372659447 +0000 UTC m=+990.887400455" watchObservedRunningTime="2025-10-06 07:01:46.399899867 +0000 UTC m=+990.914640875" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.593382 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56f94c67bb-rmg7r"] Oct 06 07:01:46 crc kubenswrapper[4845]: E1006 07:01:46.593771 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e1193ce-a230-46ae-bef2-747405888b97" containerName="init" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.593790 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e1193ce-a230-46ae-bef2-747405888b97" containerName="init" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.593992 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e1193ce-a230-46ae-bef2-747405888b97" containerName="init" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.594915 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.608695 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56f94c67bb-rmg7r"] Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.615017 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.618220 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.659534 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d82278b6-977b-40db-b925-10f8d7621e7c-logs\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.659655 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d82278b6-977b-40db-b925-10f8d7621e7c-internal-tls-certs\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.659679 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d82278b6-977b-40db-b925-10f8d7621e7c-config-data\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.659698 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnzv5\" (UniqueName: 
\"kubernetes.io/projected/d82278b6-977b-40db-b925-10f8d7621e7c-kube-api-access-dnzv5\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.659721 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d82278b6-977b-40db-b925-10f8d7621e7c-public-tls-certs\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.659736 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d82278b6-977b-40db-b925-10f8d7621e7c-config-data-custom\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.659802 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82278b6-977b-40db-b925-10f8d7621e7c-combined-ca-bundle\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.761017 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d82278b6-977b-40db-b925-10f8d7621e7c-internal-tls-certs\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.761055 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d82278b6-977b-40db-b925-10f8d7621e7c-config-data\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.761073 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnzv5\" (UniqueName: \"kubernetes.io/projected/d82278b6-977b-40db-b925-10f8d7621e7c-kube-api-access-dnzv5\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.761877 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d82278b6-977b-40db-b925-10f8d7621e7c-public-tls-certs\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.761898 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d82278b6-977b-40db-b925-10f8d7621e7c-config-data-custom\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.762245 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82278b6-977b-40db-b925-10f8d7621e7c-combined-ca-bundle\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.762285 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d82278b6-977b-40db-b925-10f8d7621e7c-logs\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.762754 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d82278b6-977b-40db-b925-10f8d7621e7c-logs\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.765730 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d82278b6-977b-40db-b925-10f8d7621e7c-config-data-custom\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.766909 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d82278b6-977b-40db-b925-10f8d7621e7c-internal-tls-certs\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.767165 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d82278b6-977b-40db-b925-10f8d7621e7c-config-data\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.768600 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82278b6-977b-40db-b925-10f8d7621e7c-combined-ca-bundle\") pod 
\"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.769311 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d82278b6-977b-40db-b925-10f8d7621e7c-public-tls-certs\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.778024 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnzv5\" (UniqueName: \"kubernetes.io/projected/d82278b6-977b-40db-b925-10f8d7621e7c-kube-api-access-dnzv5\") pod \"barbican-api-56f94c67bb-rmg7r\" (UID: \"d82278b6-977b-40db-b925-10f8d7621e7c\") " pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:46 crc kubenswrapper[4845]: I1006 07:01:46.974271 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:47 crc kubenswrapper[4845]: I1006 07:01:47.444876 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56f94c67bb-rmg7r"] Oct 06 07:01:48 crc kubenswrapper[4845]: I1006 07:01:48.309569 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56f94c67bb-rmg7r" event={"ID":"d82278b6-977b-40db-b925-10f8d7621e7c","Type":"ContainerStarted","Data":"3acd5cf3279cdba510e3dc9b1af64d7a1e46f2e3deb1fa88b9029a4c49aef9af"} Oct 06 07:01:48 crc kubenswrapper[4845]: I1006 07:01:48.309832 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56f94c67bb-rmg7r" event={"ID":"d82278b6-977b-40db-b925-10f8d7621e7c","Type":"ContainerStarted","Data":"aebb434edaa91d7cfd4926cab57964d43813dbcb7619e0032cf8ebc7017f9b1d"} Oct 06 07:01:48 crc kubenswrapper[4845]: I1006 07:01:48.309844 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56f94c67bb-rmg7r" event={"ID":"d82278b6-977b-40db-b925-10f8d7621e7c","Type":"ContainerStarted","Data":"a7fa26f15d939ad3c89fb66bd1bad98896fbbbd720e1203ce12af1a68bacaccc"} Oct 06 07:01:48 crc kubenswrapper[4845]: I1006 07:01:48.309857 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:48 crc kubenswrapper[4845]: I1006 07:01:48.309867 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56f94c67bb-rmg7r" Oct 06 07:01:48 crc kubenswrapper[4845]: I1006 07:01:48.333148 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56f94c67bb-rmg7r" podStartSLOduration=2.333127084 podStartE2EDuration="2.333127084s" podCreationTimestamp="2025-10-06 07:01:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:01:48.330177658 
+0000 UTC m=+992.844918686" watchObservedRunningTime="2025-10-06 07:01:48.333127084 +0000 UTC m=+992.847868092" Oct 06 07:01:49 crc kubenswrapper[4845]: I1006 07:01:49.691863 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:50 crc kubenswrapper[4845]: I1006 07:01:50.415366 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:51 crc kubenswrapper[4845]: I1006 07:01:51.266786 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:01:51 crc kubenswrapper[4845]: I1006 07:01:51.486004 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8469844cbb-s9qws" Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.096509 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.168483 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69b48f67c-jvz42"] Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.168794 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69b48f67c-jvz42" podUID="3c3fe663-8237-4fba-9801-a75d52661c35" containerName="dnsmasq-dns" containerID="cri-o://72beee165d533fbf20361b0a48c794946dc11415ccd65537901e2ffd2bde35cc" gracePeriod=10 Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.392892 4845 generic.go:334] "Generic (PLEG): container finished" podID="3c3fe663-8237-4fba-9801-a75d52661c35" containerID="72beee165d533fbf20361b0a48c794946dc11415ccd65537901e2ffd2bde35cc" exitCode=0 Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.392939 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b48f67c-jvz42" 
event={"ID":"3c3fe663-8237-4fba-9801-a75d52661c35","Type":"ContainerDied","Data":"72beee165d533fbf20361b0a48c794946dc11415ccd65537901e2ffd2bde35cc"}
Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.742899 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b48f67c-jvz42"
Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.808494 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-dns-svc\") pod \"3c3fe663-8237-4fba-9801-a75d52661c35\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") "
Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.808561 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-ovsdbserver-nb\") pod \"3c3fe663-8237-4fba-9801-a75d52661c35\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") "
Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.808653 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kxnk\" (UniqueName: \"kubernetes.io/projected/3c3fe663-8237-4fba-9801-a75d52661c35-kube-api-access-4kxnk\") pod \"3c3fe663-8237-4fba-9801-a75d52661c35\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") "
Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.808672 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-dns-swift-storage-0\") pod \"3c3fe663-8237-4fba-9801-a75d52661c35\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") "
Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.808723 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-ovsdbserver-sb\") pod \"3c3fe663-8237-4fba-9801-a75d52661c35\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") "
Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.808741 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-config\") pod \"3c3fe663-8237-4fba-9801-a75d52661c35\" (UID: \"3c3fe663-8237-4fba-9801-a75d52661c35\") "
Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.841557 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c3fe663-8237-4fba-9801-a75d52661c35-kube-api-access-4kxnk" (OuterVolumeSpecName: "kube-api-access-4kxnk") pod "3c3fe663-8237-4fba-9801-a75d52661c35" (UID: "3c3fe663-8237-4fba-9801-a75d52661c35"). InnerVolumeSpecName "kube-api-access-4kxnk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.858077 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3c3fe663-8237-4fba-9801-a75d52661c35" (UID: "3c3fe663-8237-4fba-9801-a75d52661c35"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.866209 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3c3fe663-8237-4fba-9801-a75d52661c35" (UID: "3c3fe663-8237-4fba-9801-a75d52661c35"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.866647 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3c3fe663-8237-4fba-9801-a75d52661c35" (UID: "3c3fe663-8237-4fba-9801-a75d52661c35"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.880648 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3c3fe663-8237-4fba-9801-a75d52661c35" (UID: "3c3fe663-8237-4fba-9801-a75d52661c35"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.885824 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-config" (OuterVolumeSpecName: "config") pod "3c3fe663-8237-4fba-9801-a75d52661c35" (UID: "3c3fe663-8237-4fba-9801-a75d52661c35"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.924673 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.924738 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-config\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.924751 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.924760 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.924769 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kxnk\" (UniqueName: \"kubernetes.io/projected/3c3fe663-8237-4fba-9801-a75d52661c35-kube-api-access-4kxnk\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:53 crc kubenswrapper[4845]: I1006 07:01:53.924777 4845 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c3fe663-8237-4fba-9801-a75d52661c35-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 06 07:01:54 crc kubenswrapper[4845]: I1006 07:01:54.412645 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b48f67c-jvz42" event={"ID":"3c3fe663-8237-4fba-9801-a75d52661c35","Type":"ContainerDied","Data":"25a9f9a9ef776d3bbc3e2d9fb1569aa65968f28bef9fb05de2fd31b6290506be"}
Oct 06 07:01:54 crc kubenswrapper[4845]: I1006 07:01:54.412716 4845 scope.go:117] "RemoveContainer" containerID="72beee165d533fbf20361b0a48c794946dc11415ccd65537901e2ffd2bde35cc"
Oct 06 07:01:54 crc kubenswrapper[4845]: I1006 07:01:54.412891 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b48f67c-jvz42"
Oct 06 07:01:54 crc kubenswrapper[4845]: I1006 07:01:54.463018 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69b48f67c-jvz42"]
Oct 06 07:01:54 crc kubenswrapper[4845]: I1006 07:01:54.468987 4845 scope.go:117] "RemoveContainer" containerID="c96ff7b3d53d39f363c7f6abce78f9345f1320b4288a580615ea1bf6635864af"
Oct 06 07:01:54 crc kubenswrapper[4845]: I1006 07:01:54.483395 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69b48f67c-jvz42"]
Oct 06 07:01:54 crc kubenswrapper[4845]: I1006 07:01:54.816570 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-67cb66d46f-6rxvh"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.490671 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Oct 06 07:01:55 crc kubenswrapper[4845]: E1006 07:01:55.491304 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3fe663-8237-4fba-9801-a75d52661c35" containerName="init"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.491327 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3fe663-8237-4fba-9801-a75d52661c35" containerName="init"
Oct 06 07:01:55 crc kubenswrapper[4845]: E1006 07:01:55.491410 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3fe663-8237-4fba-9801-a75d52661c35" containerName="dnsmasq-dns"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.491423 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3fe663-8237-4fba-9801-a75d52661c35" containerName="dnsmasq-dns"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.491747 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c3fe663-8237-4fba-9801-a75d52661c35" containerName="dnsmasq-dns"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.493000 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.496612 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.497826 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-k4sch"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.498002 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.505016 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.560239 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec118969-bd05-449c-bb6b-a460bda1b79a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ec118969-bd05-449c-bb6b-a460bda1b79a\") " pod="openstack/openstackclient"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.560715 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ec118969-bd05-449c-bb6b-a460bda1b79a-openstack-config\") pod \"openstackclient\" (UID: \"ec118969-bd05-449c-bb6b-a460bda1b79a\") " pod="openstack/openstackclient"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.560780 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ec118969-bd05-449c-bb6b-a460bda1b79a-openstack-config-secret\") pod \"openstackclient\" (UID: \"ec118969-bd05-449c-bb6b-a460bda1b79a\") " pod="openstack/openstackclient"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.560899 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g79j\" (UniqueName: \"kubernetes.io/projected/ec118969-bd05-449c-bb6b-a460bda1b79a-kube-api-access-7g79j\") pod \"openstackclient\" (UID: \"ec118969-bd05-449c-bb6b-a460bda1b79a\") " pod="openstack/openstackclient"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.663335 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ec118969-bd05-449c-bb6b-a460bda1b79a-openstack-config\") pod \"openstackclient\" (UID: \"ec118969-bd05-449c-bb6b-a460bda1b79a\") " pod="openstack/openstackclient"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.663446 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ec118969-bd05-449c-bb6b-a460bda1b79a-openstack-config-secret\") pod \"openstackclient\" (UID: \"ec118969-bd05-449c-bb6b-a460bda1b79a\") " pod="openstack/openstackclient"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.663641 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g79j\" (UniqueName: \"kubernetes.io/projected/ec118969-bd05-449c-bb6b-a460bda1b79a-kube-api-access-7g79j\") pod \"openstackclient\" (UID: \"ec118969-bd05-449c-bb6b-a460bda1b79a\") " pod="openstack/openstackclient"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.663766 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec118969-bd05-449c-bb6b-a460bda1b79a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ec118969-bd05-449c-bb6b-a460bda1b79a\") " pod="openstack/openstackclient"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.664353 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ec118969-bd05-449c-bb6b-a460bda1b79a-openstack-config\") pod \"openstackclient\" (UID: \"ec118969-bd05-449c-bb6b-a460bda1b79a\") " pod="openstack/openstackclient"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.668931 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ec118969-bd05-449c-bb6b-a460bda1b79a-openstack-config-secret\") pod \"openstackclient\" (UID: \"ec118969-bd05-449c-bb6b-a460bda1b79a\") " pod="openstack/openstackclient"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.669098 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec118969-bd05-449c-bb6b-a460bda1b79a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ec118969-bd05-449c-bb6b-a460bda1b79a\") " pod="openstack/openstackclient"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.685844 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g79j\" (UniqueName: \"kubernetes.io/projected/ec118969-bd05-449c-bb6b-a460bda1b79a-kube-api-access-7g79j\") pod \"openstackclient\" (UID: \"ec118969-bd05-449c-bb6b-a460bda1b79a\") " pod="openstack/openstackclient"
Oct 06 07:01:55 crc kubenswrapper[4845]: I1006 07:01:55.825795 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 06 07:01:56 crc kubenswrapper[4845]: I1006 07:01:56.241831 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c3fe663-8237-4fba-9801-a75d52661c35" path="/var/lib/kubelet/pods/3c3fe663-8237-4fba-9801-a75d52661c35/volumes"
Oct 06 07:01:56 crc kubenswrapper[4845]: I1006 07:01:56.316216 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 06 07:01:56 crc kubenswrapper[4845]: E1006 07:01:56.352077 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out" image="quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1"
Oct 06 07:01:56 crc kubenswrapper[4845]: E1006 07:01:56.352232 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9l6x7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(17a717dd-d883-477e-9894-653d2281a966): ErrImagePull: parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out" logger="UnhandledError"
Oct 06 07:01:56 crc kubenswrapper[4845]: I1006 07:01:56.436846 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ec118969-bd05-449c-bb6b-a460bda1b79a","Type":"ContainerStarted","Data":"77871f2cbaa56ebfadb8556287e90cba2b7501ed7fd93f9155c40c38861ab75f"}
Oct 06 07:01:57 crc kubenswrapper[4845]: I1006 07:01:57.522523 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4wcpl" event={"ID":"8c8a9f74-3441-4e68-8014-08472abf6680","Type":"ContainerStarted","Data":"f2b1ea5962862cea48cba4b0117a5621b6fc92cd223ad7186549c5d5a2d42030"}
Oct 06 07:01:57 crc kubenswrapper[4845]: I1006 07:01:57.541081 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-4wcpl" podStartSLOduration=7.002365561 podStartE2EDuration="41.541056731s" podCreationTimestamp="2025-10-06 07:01:16 +0000 UTC" firstStartedPulling="2025-10-06 07:01:21.877314404 +0000 UTC m=+966.392055412" lastFinishedPulling="2025-10-06 07:01:56.416005574 +0000 UTC m=+1000.930746582" observedRunningTime="2025-10-06 07:01:57.538412403 +0000 UTC m=+1002.053153421" watchObservedRunningTime="2025-10-06 07:01:57.541056731 +0000 UTC m=+1002.055797739"
Oct 06 07:01:58 crc kubenswrapper[4845]: I1006 07:01:58.575812 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56f94c67bb-rmg7r"
Oct 06 07:01:58 crc kubenswrapper[4845]: I1006 07:01:58.577852 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56f94c67bb-rmg7r"
Oct 06 07:01:58 crc kubenswrapper[4845]: I1006 07:01:58.657500 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6c7499b5fd-mcb8g"]
Oct 06 07:01:58 crc kubenswrapper[4845]: I1006 07:01:58.657761 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6c7499b5fd-mcb8g" podUID="622dd478-18fc-4c49-8019-1f12bf96e43e" containerName="barbican-api-log" containerID="cri-o://9ca3af92d2dfa47212e94289cb6b863e33b151e789936337d18646a478ee96c8" gracePeriod=30
Oct 06 07:01:58 crc kubenswrapper[4845]: I1006 07:01:58.658162 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6c7499b5fd-mcb8g" podUID="622dd478-18fc-4c49-8019-1f12bf96e43e" containerName="barbican-api" containerID="cri-o://a58a930bec4a730dd14092fefca2ff5a8d2eeca575bdc188173e0cccceab34c3" gracePeriod=30
Oct 06 07:01:59 crc kubenswrapper[4845]: I1006 07:01:59.546203 4845 generic.go:334] "Generic (PLEG): container finished" podID="622dd478-18fc-4c49-8019-1f12bf96e43e" containerID="9ca3af92d2dfa47212e94289cb6b863e33b151e789936337d18646a478ee96c8" exitCode=143
Oct 06 07:01:59 crc kubenswrapper[4845]: I1006 07:01:59.546478 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c7499b5fd-mcb8g" event={"ID":"622dd478-18fc-4c49-8019-1f12bf96e43e","Type":"ContainerDied","Data":"9ca3af92d2dfa47212e94289cb6b863e33b151e789936337d18646a478ee96c8"}
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.385279 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-865c95569-jxblm"]
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.397577 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.404584 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.408486 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.437888 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-865c95569-jxblm"]
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.438614 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.478447 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3a28a2d-4deb-408d-b47b-600758782cdf-log-httpd\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.478497 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f3a28a2d-4deb-408d-b47b-600758782cdf-etc-swift\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.478515 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3a28a2d-4deb-408d-b47b-600758782cdf-public-tls-certs\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.478678 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3a28a2d-4deb-408d-b47b-600758782cdf-run-httpd\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.478698 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3a28a2d-4deb-408d-b47b-600758782cdf-combined-ca-bundle\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.478771 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3a28a2d-4deb-408d-b47b-600758782cdf-config-data\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.478852 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3a28a2d-4deb-408d-b47b-600758782cdf-internal-tls-certs\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.478880 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr6zl\" (UniqueName: \"kubernetes.io/projected/f3a28a2d-4deb-408d-b47b-600758782cdf-kube-api-access-gr6zl\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.581151 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3a28a2d-4deb-408d-b47b-600758782cdf-run-httpd\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.581708 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3a28a2d-4deb-408d-b47b-600758782cdf-combined-ca-bundle\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.581759 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3a28a2d-4deb-408d-b47b-600758782cdf-config-data\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.581816 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3a28a2d-4deb-408d-b47b-600758782cdf-internal-tls-certs\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.581856 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr6zl\" (UniqueName: \"kubernetes.io/projected/f3a28a2d-4deb-408d-b47b-600758782cdf-kube-api-access-gr6zl\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.581902 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3a28a2d-4deb-408d-b47b-600758782cdf-log-httpd\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.581927 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f3a28a2d-4deb-408d-b47b-600758782cdf-etc-swift\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.581950 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3a28a2d-4deb-408d-b47b-600758782cdf-public-tls-certs\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.582211 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3a28a2d-4deb-408d-b47b-600758782cdf-run-httpd\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.583250 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3a28a2d-4deb-408d-b47b-600758782cdf-log-httpd\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.589924 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3a28a2d-4deb-408d-b47b-600758782cdf-combined-ca-bundle\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.590543 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3a28a2d-4deb-408d-b47b-600758782cdf-config-data\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.593807 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3a28a2d-4deb-408d-b47b-600758782cdf-internal-tls-certs\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.595391 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3a28a2d-4deb-408d-b47b-600758782cdf-public-tls-certs\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.595979 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f3a28a2d-4deb-408d-b47b-600758782cdf-etc-swift\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.606685 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr6zl\" (UniqueName: \"kubernetes.io/projected/f3a28a2d-4deb-408d-b47b-600758782cdf-kube-api-access-gr6zl\") pod \"swift-proxy-865c95569-jxblm\" (UID: \"f3a28a2d-4deb-408d-b47b-600758782cdf\") " pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:00 crc kubenswrapper[4845]: I1006 07:02:00.771669 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-865c95569-jxblm"
Oct 06 07:02:01 crc kubenswrapper[4845]: E1006 07:02:01.102780 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out\"" pod="openstack/ceilometer-0" podUID="17a717dd-d883-477e-9894-653d2281a966"
Oct 06 07:02:01 crc kubenswrapper[4845]: I1006 07:02:01.338219 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:02:01 crc kubenswrapper[4845]: I1006 07:02:01.439809 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-865c95569-jxblm"]
Oct 06 07:02:01 crc kubenswrapper[4845]: I1006 07:02:01.579163 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17a717dd-d883-477e-9894-653d2281a966","Type":"ContainerStarted","Data":"31c997e3e14c26082ad488cbe1050af4960dc495e806a905789d6740cdf97478"}
Oct 06 07:02:01 crc kubenswrapper[4845]: I1006 07:02:01.579384 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 06 07:02:01 crc kubenswrapper[4845]: I1006 07:02:01.581054 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-865c95569-jxblm" event={"ID":"f3a28a2d-4deb-408d-b47b-600758782cdf","Type":"ContainerStarted","Data":"c22e74824c9dc8891d97d3a4f2babec37ac59c198f1ef09839fa2d7e0d5c9061"}
Oct 06 07:02:01 crc kubenswrapper[4845]: E1006 07:02:01.581227 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1\\\"\"" pod="openstack/ceilometer-0" podUID="17a717dd-d883-477e-9894-653d2281a966"
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.256232 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6c7499b5fd-mcb8g"
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.331071 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/622dd478-18fc-4c49-8019-1f12bf96e43e-combined-ca-bundle\") pod \"622dd478-18fc-4c49-8019-1f12bf96e43e\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") "
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.331150 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/622dd478-18fc-4c49-8019-1f12bf96e43e-logs\") pod \"622dd478-18fc-4c49-8019-1f12bf96e43e\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") "
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.331331 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/622dd478-18fc-4c49-8019-1f12bf96e43e-config-data-custom\") pod \"622dd478-18fc-4c49-8019-1f12bf96e43e\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") "
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.331488 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/622dd478-18fc-4c49-8019-1f12bf96e43e-config-data\") pod \"622dd478-18fc-4c49-8019-1f12bf96e43e\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") "
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.331658 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk68g\" (UniqueName: \"kubernetes.io/projected/622dd478-18fc-4c49-8019-1f12bf96e43e-kube-api-access-wk68g\") pod \"622dd478-18fc-4c49-8019-1f12bf96e43e\" (UID: \"622dd478-18fc-4c49-8019-1f12bf96e43e\") "
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.332402 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/622dd478-18fc-4c49-8019-1f12bf96e43e-logs" (OuterVolumeSpecName: "logs") pod "622dd478-18fc-4c49-8019-1f12bf96e43e" (UID: "622dd478-18fc-4c49-8019-1f12bf96e43e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.332995 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/622dd478-18fc-4c49-8019-1f12bf96e43e-logs\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.337553 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/622dd478-18fc-4c49-8019-1f12bf96e43e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "622dd478-18fc-4c49-8019-1f12bf96e43e" (UID: "622dd478-18fc-4c49-8019-1f12bf96e43e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.338667 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/622dd478-18fc-4c49-8019-1f12bf96e43e-kube-api-access-wk68g" (OuterVolumeSpecName: "kube-api-access-wk68g") pod "622dd478-18fc-4c49-8019-1f12bf96e43e" (UID: "622dd478-18fc-4c49-8019-1f12bf96e43e"). InnerVolumeSpecName "kube-api-access-wk68g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.357052 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/622dd478-18fc-4c49-8019-1f12bf96e43e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "622dd478-18fc-4c49-8019-1f12bf96e43e" (UID: "622dd478-18fc-4c49-8019-1f12bf96e43e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.382310 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/622dd478-18fc-4c49-8019-1f12bf96e43e-config-data" (OuterVolumeSpecName: "config-data") pod "622dd478-18fc-4c49-8019-1f12bf96e43e" (UID: "622dd478-18fc-4c49-8019-1f12bf96e43e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.435760 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/622dd478-18fc-4c49-8019-1f12bf96e43e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.435809 4845 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/622dd478-18fc-4c49-8019-1f12bf96e43e-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.435823 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/622dd478-18fc-4c49-8019-1f12bf96e43e-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.435836 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk68g\" (UniqueName: \"kubernetes.io/projected/622dd478-18fc-4c49-8019-1f12bf96e43e-kube-api-access-wk68g\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.607659 4845 generic.go:334] "Generic (PLEG): container finished" podID="622dd478-18fc-4c49-8019-1f12bf96e43e" containerID="a58a930bec4a730dd14092fefca2ff5a8d2eeca575bdc188173e0cccceab34c3" exitCode=0
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.607769 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c7499b5fd-mcb8g" event={"ID":"622dd478-18fc-4c49-8019-1f12bf96e43e","Type":"ContainerDied","Data":"a58a930bec4a730dd14092fefca2ff5a8d2eeca575bdc188173e0cccceab34c3"}
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.607799 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c7499b5fd-mcb8g" event={"ID":"622dd478-18fc-4c49-8019-1f12bf96e43e","Type":"ContainerDied","Data":"85aa4976cba23eaa79a14797be4a76d4c4a64e73268e0a04803af4b1c7f8eb3c"}
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.607816 4845 scope.go:117] "RemoveContainer" containerID="a58a930bec4a730dd14092fefca2ff5a8d2eeca575bdc188173e0cccceab34c3"
Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.607963 4845 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/barbican-api-6c7499b5fd-mcb8g" Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.615947 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17a717dd-d883-477e-9894-653d2281a966" containerName="ceilometer-central-agent" containerID="cri-o://2167885e23ddb5802aae3cde92cf7e5c5b9a70e5913c3ca2fd4df7d58d433565" gracePeriod=30 Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.617135 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-865c95569-jxblm" event={"ID":"f3a28a2d-4deb-408d-b47b-600758782cdf","Type":"ContainerStarted","Data":"c422b7940771c664a156405bcf33d79db8716162a7b64f263d8b87d7a3576b8a"} Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.617165 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-865c95569-jxblm" Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.617175 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-865c95569-jxblm" event={"ID":"f3a28a2d-4deb-408d-b47b-600758782cdf","Type":"ContainerStarted","Data":"cc796dfef9b47cf2b8701118c80a7e59f78a1baa314624f382f6dfeda09006e5"} Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.617425 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17a717dd-d883-477e-9894-653d2281a966" containerName="proxy-httpd" containerID="cri-o://31c997e3e14c26082ad488cbe1050af4960dc495e806a905789d6740cdf97478" gracePeriod=30 Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.617480 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17a717dd-d883-477e-9894-653d2281a966" containerName="ceilometer-notification-agent" containerID="cri-o://e0ab01dd95166993806b5271a47be68edd03b7ed0e90cb14a391ab1788827a98" gracePeriod=30 Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.617537 
4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-865c95569-jxblm" Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.649532 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-865c95569-jxblm" podStartSLOduration=2.649517931 podStartE2EDuration="2.649517931s" podCreationTimestamp="2025-10-06 07:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:02:02.645026226 +0000 UTC m=+1007.159767234" watchObservedRunningTime="2025-10-06 07:02:02.649517931 +0000 UTC m=+1007.164258939" Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.691169 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-4g7bh"] Oct 06 07:02:02 crc kubenswrapper[4845]: E1006 07:02:02.691809 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="622dd478-18fc-4c49-8019-1f12bf96e43e" containerName="barbican-api-log" Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.691833 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="622dd478-18fc-4c49-8019-1f12bf96e43e" containerName="barbican-api-log" Oct 06 07:02:02 crc kubenswrapper[4845]: E1006 07:02:02.691867 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="622dd478-18fc-4c49-8019-1f12bf96e43e" containerName="barbican-api" Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.691876 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="622dd478-18fc-4c49-8019-1f12bf96e43e" containerName="barbican-api" Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.692142 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="622dd478-18fc-4c49-8019-1f12bf96e43e" containerName="barbican-api" Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.692167 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="622dd478-18fc-4c49-8019-1f12bf96e43e" 
containerName="barbican-api-log" Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.693131 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4g7bh" Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.710176 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4g7bh"] Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.727164 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6c7499b5fd-mcb8g"] Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.734265 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6c7499b5fd-mcb8g"] Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.746032 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75fb6\" (UniqueName: \"kubernetes.io/projected/d4713c22-fab5-490b-b167-abae4ecfca11-kube-api-access-75fb6\") pod \"nova-api-db-create-4g7bh\" (UID: \"d4713c22-fab5-490b-b167-abae4ecfca11\") " pod="openstack/nova-api-db-create-4g7bh" Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.778435 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-zdqjb"] Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.779649 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-zdqjb" Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.781029 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-zdqjb"] Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.848232 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk5xl\" (UniqueName: \"kubernetes.io/projected/168d541f-1e89-45b6-9817-91a8242a44fd-kube-api-access-vk5xl\") pod \"nova-cell0-db-create-zdqjb\" (UID: \"168d541f-1e89-45b6-9817-91a8242a44fd\") " pod="openstack/nova-cell0-db-create-zdqjb" Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.848321 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75fb6\" (UniqueName: \"kubernetes.io/projected/d4713c22-fab5-490b-b167-abae4ecfca11-kube-api-access-75fb6\") pod \"nova-api-db-create-4g7bh\" (UID: \"d4713c22-fab5-490b-b167-abae4ecfca11\") " pod="openstack/nova-api-db-create-4g7bh" Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.869275 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-2ct4d"] Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.871990 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-2ct4d" Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.887705 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-2ct4d"] Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.904013 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75fb6\" (UniqueName: \"kubernetes.io/projected/d4713c22-fab5-490b-b167-abae4ecfca11-kube-api-access-75fb6\") pod \"nova-api-db-create-4g7bh\" (UID: \"d4713c22-fab5-490b-b167-abae4ecfca11\") " pod="openstack/nova-api-db-create-4g7bh" Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.951413 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk5xl\" (UniqueName: \"kubernetes.io/projected/168d541f-1e89-45b6-9817-91a8242a44fd-kube-api-access-vk5xl\") pod \"nova-cell0-db-create-zdqjb\" (UID: \"168d541f-1e89-45b6-9817-91a8242a44fd\") " pod="openstack/nova-cell0-db-create-zdqjb" Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.951477 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7xwb\" (UniqueName: \"kubernetes.io/projected/4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd-kube-api-access-h7xwb\") pod \"nova-cell1-db-create-2ct4d\" (UID: \"4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd\") " pod="openstack/nova-cell1-db-create-2ct4d" Oct 06 07:02:02 crc kubenswrapper[4845]: I1006 07:02:02.977335 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk5xl\" (UniqueName: \"kubernetes.io/projected/168d541f-1e89-45b6-9817-91a8242a44fd-kube-api-access-vk5xl\") pod \"nova-cell0-db-create-zdqjb\" (UID: \"168d541f-1e89-45b6-9817-91a8242a44fd\") " pod="openstack/nova-cell0-db-create-zdqjb" Oct 06 07:02:03 crc kubenswrapper[4845]: I1006 07:02:03.020811 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-4g7bh" Oct 06 07:02:03 crc kubenswrapper[4845]: I1006 07:02:03.053771 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7xwb\" (UniqueName: \"kubernetes.io/projected/4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd-kube-api-access-h7xwb\") pod \"nova-cell1-db-create-2ct4d\" (UID: \"4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd\") " pod="openstack/nova-cell1-db-create-2ct4d" Oct 06 07:02:03 crc kubenswrapper[4845]: I1006 07:02:03.079176 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7xwb\" (UniqueName: \"kubernetes.io/projected/4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd-kube-api-access-h7xwb\") pod \"nova-cell1-db-create-2ct4d\" (UID: \"4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd\") " pod="openstack/nova-cell1-db-create-2ct4d" Oct 06 07:02:03 crc kubenswrapper[4845]: I1006 07:02:03.111012 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zdqjb" Oct 06 07:02:03 crc kubenswrapper[4845]: I1006 07:02:03.318695 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-2ct4d" Oct 06 07:02:03 crc kubenswrapper[4845]: I1006 07:02:03.629346 4845 generic.go:334] "Generic (PLEG): container finished" podID="17a717dd-d883-477e-9894-653d2281a966" containerID="31c997e3e14c26082ad488cbe1050af4960dc495e806a905789d6740cdf97478" exitCode=0 Oct 06 07:02:03 crc kubenswrapper[4845]: I1006 07:02:03.629407 4845 generic.go:334] "Generic (PLEG): container finished" podID="17a717dd-d883-477e-9894-653d2281a966" containerID="2167885e23ddb5802aae3cde92cf7e5c5b9a70e5913c3ca2fd4df7d58d433565" exitCode=0 Oct 06 07:02:03 crc kubenswrapper[4845]: I1006 07:02:03.629456 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17a717dd-d883-477e-9894-653d2281a966","Type":"ContainerDied","Data":"31c997e3e14c26082ad488cbe1050af4960dc495e806a905789d6740cdf97478"} Oct 06 07:02:03 crc kubenswrapper[4845]: I1006 07:02:03.629489 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17a717dd-d883-477e-9894-653d2281a966","Type":"ContainerDied","Data":"2167885e23ddb5802aae3cde92cf7e5c5b9a70e5913c3ca2fd4df7d58d433565"} Oct 06 07:02:03 crc kubenswrapper[4845]: I1006 07:02:03.631194 4845 generic.go:334] "Generic (PLEG): container finished" podID="8c8a9f74-3441-4e68-8014-08472abf6680" containerID="f2b1ea5962862cea48cba4b0117a5621b6fc92cd223ad7186549c5d5a2d42030" exitCode=0 Oct 06 07:02:03 crc kubenswrapper[4845]: I1006 07:02:03.632481 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4wcpl" event={"ID":"8c8a9f74-3441-4e68-8014-08472abf6680","Type":"ContainerDied","Data":"f2b1ea5962862cea48cba4b0117a5621b6fc92cd223ad7186549c5d5a2d42030"} Oct 06 07:02:03 crc kubenswrapper[4845]: I1006 07:02:03.683253 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:02:04 crc kubenswrapper[4845]: I1006 07:02:04.240438 4845 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="622dd478-18fc-4c49-8019-1f12bf96e43e" path="/var/lib/kubelet/pods/622dd478-18fc-4c49-8019-1f12bf96e43e/volumes" Oct 06 07:02:05 crc kubenswrapper[4845]: I1006 07:02:05.867403 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-767548595f-nndsw" Oct 06 07:02:05 crc kubenswrapper[4845]: I1006 07:02:05.942321 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5574b86b4d-dzxdt"] Oct 06 07:02:05 crc kubenswrapper[4845]: I1006 07:02:05.942823 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5574b86b4d-dzxdt" podUID="7e6e1657-fd8e-4dd4-b3be-166870ab020a" containerName="neutron-api" containerID="cri-o://04d5afdbea5e58792aa4d2401c5f7eb9f446eb9bc74726a628bd7d0469805e10" gracePeriod=30 Oct 06 07:02:05 crc kubenswrapper[4845]: I1006 07:02:05.942942 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5574b86b4d-dzxdt" podUID="7e6e1657-fd8e-4dd4-b3be-166870ab020a" containerName="neutron-httpd" containerID="cri-o://9b9093e8cd2362231534c1a487b1e59a14405f7b8e314cccfb65cf7cebcbc1ce" gracePeriod=30 Oct 06 07:02:06 crc kubenswrapper[4845]: I1006 07:02:06.658302 4845 generic.go:334] "Generic (PLEG): container finished" podID="7e6e1657-fd8e-4dd4-b3be-166870ab020a" containerID="9b9093e8cd2362231534c1a487b1e59a14405f7b8e314cccfb65cf7cebcbc1ce" exitCode=0 Oct 06 07:02:06 crc kubenswrapper[4845]: I1006 07:02:06.658344 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5574b86b4d-dzxdt" event={"ID":"7e6e1657-fd8e-4dd4-b3be-166870ab020a","Type":"ContainerDied","Data":"9b9093e8cd2362231534c1a487b1e59a14405f7b8e314cccfb65cf7cebcbc1ce"} Oct 06 07:02:07 crc kubenswrapper[4845]: I1006 07:02:07.672337 4845 generic.go:334] "Generic (PLEG): container finished" podID="17a717dd-d883-477e-9894-653d2281a966" 
containerID="e0ab01dd95166993806b5271a47be68edd03b7ed0e90cb14a391ab1788827a98" exitCode=0 Oct 06 07:02:07 crc kubenswrapper[4845]: I1006 07:02:07.672445 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17a717dd-d883-477e-9894-653d2281a966","Type":"ContainerDied","Data":"e0ab01dd95166993806b5271a47be68edd03b7ed0e90cb14a391ab1788827a98"} Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.456065 4845 scope.go:117] "RemoveContainer" containerID="9ca3af92d2dfa47212e94289cb6b863e33b151e789936337d18646a478ee96c8" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.746208 4845 scope.go:117] "RemoveContainer" containerID="a58a930bec4a730dd14092fefca2ff5a8d2eeca575bdc188173e0cccceab34c3" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.751130 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4wcpl" event={"ID":"8c8a9f74-3441-4e68-8014-08472abf6680","Type":"ContainerDied","Data":"94334d48dee581d4b37fe73ef46048545a540b25769c18388bcda2deb979e81e"} Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.751172 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94334d48dee581d4b37fe73ef46048545a540b25769c18388bcda2deb979e81e" Oct 06 07:02:08 crc kubenswrapper[4845]: E1006 07:02:08.751243 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a58a930bec4a730dd14092fefca2ff5a8d2eeca575bdc188173e0cccceab34c3\": container with ID starting with a58a930bec4a730dd14092fefca2ff5a8d2eeca575bdc188173e0cccceab34c3 not found: ID does not exist" containerID="a58a930bec4a730dd14092fefca2ff5a8d2eeca575bdc188173e0cccceab34c3" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.751268 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a58a930bec4a730dd14092fefca2ff5a8d2eeca575bdc188173e0cccceab34c3"} err="failed to get container status 
\"a58a930bec4a730dd14092fefca2ff5a8d2eeca575bdc188173e0cccceab34c3\": rpc error: code = NotFound desc = could not find container \"a58a930bec4a730dd14092fefca2ff5a8d2eeca575bdc188173e0cccceab34c3\": container with ID starting with a58a930bec4a730dd14092fefca2ff5a8d2eeca575bdc188173e0cccceab34c3 not found: ID does not exist" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.751296 4845 scope.go:117] "RemoveContainer" containerID="9ca3af92d2dfa47212e94289cb6b863e33b151e789936337d18646a478ee96c8" Oct 06 07:02:08 crc kubenswrapper[4845]: E1006 07:02:08.767552 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ca3af92d2dfa47212e94289cb6b863e33b151e789936337d18646a478ee96c8\": container with ID starting with 9ca3af92d2dfa47212e94289cb6b863e33b151e789936337d18646a478ee96c8 not found: ID does not exist" containerID="9ca3af92d2dfa47212e94289cb6b863e33b151e789936337d18646a478ee96c8" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.767589 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca3af92d2dfa47212e94289cb6b863e33b151e789936337d18646a478ee96c8"} err="failed to get container status \"9ca3af92d2dfa47212e94289cb6b863e33b151e789936337d18646a478ee96c8\": rpc error: code = NotFound desc = could not find container \"9ca3af92d2dfa47212e94289cb6b863e33b151e789936337d18646a478ee96c8\": container with ID starting with 9ca3af92d2dfa47212e94289cb6b863e33b151e789936337d18646a478ee96c8 not found: ID does not exist" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.771943 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.782554 4845 generic.go:334] "Generic (PLEG): container finished" podID="7e6e1657-fd8e-4dd4-b3be-166870ab020a" containerID="04d5afdbea5e58792aa4d2401c5f7eb9f446eb9bc74726a628bd7d0469805e10" exitCode=0 Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.782730 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5574b86b4d-dzxdt" event={"ID":"7e6e1657-fd8e-4dd4-b3be-166870ab020a","Type":"ContainerDied","Data":"04d5afdbea5e58792aa4d2401c5f7eb9f446eb9bc74726a628bd7d0469805e10"} Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.882586 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c8a9f74-3441-4e68-8014-08472abf6680-etc-machine-id\") pod \"8c8a9f74-3441-4e68-8014-08472abf6680\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.882689 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-db-sync-config-data\") pod \"8c8a9f74-3441-4e68-8014-08472abf6680\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.882725 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-scripts\") pod \"8c8a9f74-3441-4e68-8014-08472abf6680\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.882768 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcffr\" (UniqueName: \"kubernetes.io/projected/8c8a9f74-3441-4e68-8014-08472abf6680-kube-api-access-bcffr\") pod \"8c8a9f74-3441-4e68-8014-08472abf6680\" 
(UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.882808 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-combined-ca-bundle\") pod \"8c8a9f74-3441-4e68-8014-08472abf6680\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.882850 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-config-data\") pod \"8c8a9f74-3441-4e68-8014-08472abf6680\" (UID: \"8c8a9f74-3441-4e68-8014-08472abf6680\") " Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.886480 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c8a9f74-3441-4e68-8014-08472abf6680-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8c8a9f74-3441-4e68-8014-08472abf6680" (UID: "8c8a9f74-3441-4e68-8014-08472abf6680"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.898539 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8c8a9f74-3441-4e68-8014-08472abf6680" (UID: "8c8a9f74-3441-4e68-8014-08472abf6680"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.898843 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c8a9f74-3441-4e68-8014-08472abf6680-kube-api-access-bcffr" (OuterVolumeSpecName: "kube-api-access-bcffr") pod "8c8a9f74-3441-4e68-8014-08472abf6680" (UID: "8c8a9f74-3441-4e68-8014-08472abf6680"). InnerVolumeSpecName "kube-api-access-bcffr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.899003 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-scripts" (OuterVolumeSpecName: "scripts") pod "8c8a9f74-3441-4e68-8014-08472abf6680" (UID: "8c8a9f74-3441-4e68-8014-08472abf6680"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.909057 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.920493 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c8a9f74-3441-4e68-8014-08472abf6680" (UID: "8c8a9f74-3441-4e68-8014-08472abf6680"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.984193 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-scripts\") pod \"17a717dd-d883-477e-9894-653d2281a966\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.984243 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17a717dd-d883-477e-9894-653d2281a966-run-httpd\") pod \"17a717dd-d883-477e-9894-653d2281a966\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.984326 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-combined-ca-bundle\") pod \"17a717dd-d883-477e-9894-653d2281a966\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.984350 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l6x7\" (UniqueName: \"kubernetes.io/projected/17a717dd-d883-477e-9894-653d2281a966-kube-api-access-9l6x7\") pod \"17a717dd-d883-477e-9894-653d2281a966\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.984470 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-config-data\") pod \"17a717dd-d883-477e-9894-653d2281a966\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.984491 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-sg-core-conf-yaml\") pod \"17a717dd-d883-477e-9894-653d2281a966\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.984526 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-config-data" (OuterVolumeSpecName: "config-data") pod "8c8a9f74-3441-4e68-8014-08472abf6680" (UID: "8c8a9f74-3441-4e68-8014-08472abf6680"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.984619 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17a717dd-d883-477e-9894-653d2281a966-log-httpd\") pod \"17a717dd-d883-477e-9894-653d2281a966\" (UID: \"17a717dd-d883-477e-9894-653d2281a966\") " Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.984985 4845 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.984997 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.985006 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcffr\" (UniqueName: \"kubernetes.io/projected/8c8a9f74-3441-4e68-8014-08472abf6680-kube-api-access-bcffr\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.985017 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.985026 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8a9f74-3441-4e68-8014-08472abf6680-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.985035 4845 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c8a9f74-3441-4e68-8014-08472abf6680-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.985383 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17a717dd-d883-477e-9894-653d2281a966-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "17a717dd-d883-477e-9894-653d2281a966" (UID: "17a717dd-d883-477e-9894-653d2281a966"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.986692 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17a717dd-d883-477e-9894-653d2281a966-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "17a717dd-d883-477e-9894-653d2281a966" (UID: "17a717dd-d883-477e-9894-653d2281a966"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.991453 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "17a717dd-d883-477e-9894-653d2281a966" (UID: "17a717dd-d883-477e-9894-653d2281a966"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:08 crc kubenswrapper[4845]: I1006 07:02:08.996416 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17a717dd-d883-477e-9894-653d2281a966-kube-api-access-9l6x7" (OuterVolumeSpecName: "kube-api-access-9l6x7") pod "17a717dd-d883-477e-9894-653d2281a966" (UID: "17a717dd-d883-477e-9894-653d2281a966"). InnerVolumeSpecName "kube-api-access-9l6x7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.002487 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-scripts" (OuterVolumeSpecName: "scripts") pod "17a717dd-d883-477e-9894-653d2281a966" (UID: "17a717dd-d883-477e-9894-653d2281a966"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.097692 4845 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.097731 4845 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17a717dd-d883-477e-9894-653d2281a966-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.097743 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.097755 4845 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17a717dd-d883-477e-9894-653d2281a966-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:09 crc 
kubenswrapper[4845]: I1006 07:02:09.097765 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l6x7\" (UniqueName: \"kubernetes.io/projected/17a717dd-d883-477e-9894-653d2281a966-kube-api-access-9l6x7\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.099768 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.100050 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-config-data" (OuterVolumeSpecName: "config-data") pod "17a717dd-d883-477e-9894-653d2281a966" (UID: "17a717dd-d883-477e-9894-653d2281a966"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.142595 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17a717dd-d883-477e-9894-653d2281a966" (UID: "17a717dd-d883-477e-9894-653d2281a966"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.174675 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-zdqjb"] Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.199630 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.199662 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a717dd-d883-477e-9894-653d2281a966-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.248276 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-2ct4d"] Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.255900 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4g7bh"] Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.300886 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-ovndb-tls-certs\") pod \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\" (UID: \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.300971 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-config\") pod \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\" (UID: \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.301048 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-httpd-config\") pod \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\" (UID: \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.301179 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcxvl\" (UniqueName: \"kubernetes.io/projected/7e6e1657-fd8e-4dd4-b3be-166870ab020a-kube-api-access-hcxvl\") pod \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\" (UID: \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.301226 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-combined-ca-bundle\") pod \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\" (UID: \"7e6e1657-fd8e-4dd4-b3be-166870ab020a\") " Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.306708 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7e6e1657-fd8e-4dd4-b3be-166870ab020a" (UID: "7e6e1657-fd8e-4dd4-b3be-166870ab020a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.310739 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e6e1657-fd8e-4dd4-b3be-166870ab020a-kube-api-access-hcxvl" (OuterVolumeSpecName: "kube-api-access-hcxvl") pod "7e6e1657-fd8e-4dd4-b3be-166870ab020a" (UID: "7e6e1657-fd8e-4dd4-b3be-166870ab020a"). InnerVolumeSpecName "kube-api-access-hcxvl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.404937 4845 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.405296 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcxvl\" (UniqueName: \"kubernetes.io/projected/7e6e1657-fd8e-4dd4-b3be-166870ab020a-kube-api-access-hcxvl\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.462018 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-config" (OuterVolumeSpecName: "config") pod "7e6e1657-fd8e-4dd4-b3be-166870ab020a" (UID: "7e6e1657-fd8e-4dd4-b3be-166870ab020a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.500715 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e6e1657-fd8e-4dd4-b3be-166870ab020a" (UID: "7e6e1657-fd8e-4dd4-b3be-166870ab020a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.510496 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-config\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.510527 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.519087 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7e6e1657-fd8e-4dd4-b3be-166870ab020a" (UID: "7e6e1657-fd8e-4dd4-b3be-166870ab020a"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.612901 4845 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e6e1657-fd8e-4dd4-b3be-166870ab020a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.794502 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17a717dd-d883-477e-9894-653d2281a966","Type":"ContainerDied","Data":"8d1aed851da3395288e193f555f233b1ee279c4059f6c16e428b407ada6c794f"} Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.794558 4845 scope.go:117] "RemoveContainer" containerID="31c997e3e14c26082ad488cbe1050af4960dc495e806a905789d6740cdf97478" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.794656 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.806592 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4g7bh" event={"ID":"d4713c22-fab5-490b-b167-abae4ecfca11","Type":"ContainerStarted","Data":"8727d7d487d095e6cab6601d2e153a1c869bbc6a443e26047b1e7773ff9f300f"} Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.806664 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4g7bh" event={"ID":"d4713c22-fab5-490b-b167-abae4ecfca11","Type":"ContainerStarted","Data":"f41536c64fae359d4fe3f12d4f5f90a059de137d50415f3cce002ac7051aac8a"} Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.809600 4845 generic.go:334] "Generic (PLEG): container finished" podID="4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd" containerID="5bf0d22cbdc527c5b53ef8b0f961510fada037f062a19e2ecee78bf35ab0004e" exitCode=0 Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.809716 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-2ct4d" event={"ID":"4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd","Type":"ContainerDied","Data":"5bf0d22cbdc527c5b53ef8b0f961510fada037f062a19e2ecee78bf35ab0004e"} Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.809741 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-2ct4d" event={"ID":"4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd","Type":"ContainerStarted","Data":"392f530419ce16b40e9559593e0972334b6ad2ab7495e9f81d868e5782e15408"} Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.816979 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ec118969-bd05-449c-bb6b-a460bda1b79a","Type":"ContainerStarted","Data":"8d646603fcb7db59663ddf2e24ac3f450774505fa1b3aae94c0d1a5d69031140"} Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.821813 4845 generic.go:334] "Generic (PLEG): container finished" 
podID="168d541f-1e89-45b6-9817-91a8242a44fd" containerID="a7f94e7944128eae6ea27e0cf374469d4e8d970eee1656f60998e0892c4c6f08" exitCode=0 Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.821949 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zdqjb" event={"ID":"168d541f-1e89-45b6-9817-91a8242a44fd","Type":"ContainerDied","Data":"a7f94e7944128eae6ea27e0cf374469d4e8d970eee1656f60998e0892c4c6f08"} Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.822031 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zdqjb" event={"ID":"168d541f-1e89-45b6-9817-91a8242a44fd","Type":"ContainerStarted","Data":"0f1b0d9c9f30119cc773816e772b7343e18cae3f4d0edd59d66ac7c53b07974e"} Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.825699 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5574b86b4d-dzxdt" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.825698 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5574b86b4d-dzxdt" event={"ID":"7e6e1657-fd8e-4dd4-b3be-166870ab020a","Type":"ContainerDied","Data":"8161852c4b9a44a55dfb35197ca62721baa0d33100a3c692bc5fad6557a7a68a"} Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.826518 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-4wcpl" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.830088 4845 scope.go:117] "RemoveContainer" containerID="e0ab01dd95166993806b5271a47be68edd03b7ed0e90cb14a391ab1788827a98" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.905068 4845 scope.go:117] "RemoveContainer" containerID="2167885e23ddb5802aae3cde92cf7e5c5b9a70e5913c3ca2fd4df7d58d433565" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.915823 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.978234 4845 scope.go:117] "RemoveContainer" containerID="9b9093e8cd2362231534c1a487b1e59a14405f7b8e314cccfb65cf7cebcbc1ce" Oct 06 07:02:09 crc kubenswrapper[4845]: I1006 07:02:09.981337 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.021021 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:02:10 crc kubenswrapper[4845]: E1006 07:02:10.022787 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a717dd-d883-477e-9894-653d2281a966" containerName="ceilometer-central-agent" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.022811 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a717dd-d883-477e-9894-653d2281a966" containerName="ceilometer-central-agent" Oct 06 07:02:10 crc kubenswrapper[4845]: E1006 07:02:10.022829 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e6e1657-fd8e-4dd4-b3be-166870ab020a" containerName="neutron-httpd" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.022835 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6e1657-fd8e-4dd4-b3be-166870ab020a" containerName="neutron-httpd" Oct 06 07:02:10 crc kubenswrapper[4845]: E1006 07:02:10.022847 4845 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8c8a9f74-3441-4e68-8014-08472abf6680" containerName="cinder-db-sync" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.022854 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8a9f74-3441-4e68-8014-08472abf6680" containerName="cinder-db-sync" Oct 06 07:02:10 crc kubenswrapper[4845]: E1006 07:02:10.022865 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a717dd-d883-477e-9894-653d2281a966" containerName="proxy-httpd" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.022870 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a717dd-d883-477e-9894-653d2281a966" containerName="proxy-httpd" Oct 06 07:02:10 crc kubenswrapper[4845]: E1006 07:02:10.022888 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e6e1657-fd8e-4dd4-b3be-166870ab020a" containerName="neutron-api" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.022894 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6e1657-fd8e-4dd4-b3be-166870ab020a" containerName="neutron-api" Oct 06 07:02:10 crc kubenswrapper[4845]: E1006 07:02:10.022917 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a717dd-d883-477e-9894-653d2281a966" containerName="ceilometer-notification-agent" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.022922 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a717dd-d883-477e-9894-653d2281a966" containerName="ceilometer-notification-agent" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.023114 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e6e1657-fd8e-4dd4-b3be-166870ab020a" containerName="neutron-api" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.023129 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="17a717dd-d883-477e-9894-653d2281a966" containerName="proxy-httpd" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.023138 4845 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8c8a9f74-3441-4e68-8014-08472abf6680" containerName="cinder-db-sync" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.023157 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="17a717dd-d883-477e-9894-653d2281a966" containerName="ceilometer-notification-agent" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.023164 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e6e1657-fd8e-4dd4-b3be-166870ab020a" containerName="neutron-httpd" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.023171 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="17a717dd-d883-477e-9894-653d2281a966" containerName="ceilometer-central-agent" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.030008 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.031530 4845 scope.go:117] "RemoveContainer" containerID="04d5afdbea5e58792aa4d2401c5f7eb9f446eb9bc74726a628bd7d0469805e10" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.032327 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.036115 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.095013 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.103938 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-4g7bh" podStartSLOduration=8.103916322 podStartE2EDuration="8.103916322s" podCreationTimestamp="2025-10-06 07:02:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:02:09.91250605 +0000 UTC 
m=+1014.427247058" watchObservedRunningTime="2025-10-06 07:02:10.103916322 +0000 UTC m=+1014.618657330" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.134917 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.9138216569999997 podStartE2EDuration="15.134896118s" podCreationTimestamp="2025-10-06 07:01:55 +0000 UTC" firstStartedPulling="2025-10-06 07:01:56.332262691 +0000 UTC m=+1000.847003699" lastFinishedPulling="2025-10-06 07:02:08.553337152 +0000 UTC m=+1013.068078160" observedRunningTime="2025-10-06 07:02:09.941975328 +0000 UTC m=+1014.456716336" watchObservedRunningTime="2025-10-06 07:02:10.134896118 +0000 UTC m=+1014.649637136" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.154014 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5574b86b4d-dzxdt"] Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.169242 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5574b86b4d-dzxdt"] Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.200000 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.202131 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.204901 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.205116 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.205359 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.208817 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mznpm" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.214584 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.230996 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-run-httpd\") pod \"ceilometer-0\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.231073 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgr7j\" (UniqueName: \"kubernetes.io/projected/6cd77935-7853-4e79-becd-771dfdb9cd4f-kube-api-access-pgr7j\") pod \"cinder-scheduler-0\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.231151 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-scripts\") pod \"ceilometer-0\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " 
pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.231175 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7ggl\" (UniqueName: \"kubernetes.io/projected/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-kube-api-access-s7ggl\") pod \"ceilometer-0\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.231207 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-log-httpd\") pod \"ceilometer-0\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.231233 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-scripts\") pod \"cinder-scheduler-0\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.231263 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.231299 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.231321 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cd77935-7853-4e79-becd-771dfdb9cd4f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.231358 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.231421 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-config-data\") pod \"ceilometer-0\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.231448 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.231477 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-config-data\") pod \"cinder-scheduler-0\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.253586 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="17a717dd-d883-477e-9894-653d2281a966" path="/var/lib/kubelet/pods/17a717dd-d883-477e-9894-653d2281a966/volumes" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.258921 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e6e1657-fd8e-4dd4-b3be-166870ab020a" path="/var/lib/kubelet/pods/7e6e1657-fd8e-4dd4-b3be-166870ab020a/volumes" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.263065 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66c766bfdf-qf7h6"] Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.265753 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66c766bfdf-qf7h6"] Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.265794 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.266300 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.267581 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.270694 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.277577 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333361 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-log-httpd\") pod \"ceilometer-0\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333438 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-dns-svc\") pod \"dnsmasq-dns-66c766bfdf-qf7h6\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333468 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-scripts\") pod \"cinder-scheduler-0\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333488 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-ovsdbserver-nb\") pod \"dnsmasq-dns-66c766bfdf-qf7h6\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333512 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-scripts\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333543 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333566 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-config-data-custom\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333599 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333624 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cd77935-7853-4e79-becd-771dfdb9cd4f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333649 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-config\") pod \"dnsmasq-dns-66c766bfdf-qf7h6\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333721 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-dns-swift-storage-0\") pod \"dnsmasq-dns-66c766bfdf-qf7h6\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333745 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333773 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333814 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d36d6645-0c4c-431d-82fc-9fc744549046-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333844 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-config-data\") pod \"ceilometer-0\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333867 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mdzs\" (UniqueName: \"kubernetes.io/projected/967c80f8-047c-4c0a-a81f-25b6741caf0a-kube-api-access-2mdzs\") pod \"dnsmasq-dns-66c766bfdf-qf7h6\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333894 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333916 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd5pg\" (UniqueName: \"kubernetes.io/projected/d36d6645-0c4c-431d-82fc-9fc744549046-kube-api-access-nd5pg\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333943 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-config-data\") pod \"cinder-scheduler-0\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333967 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-run-httpd\") pod \"ceilometer-0\" (UID: 
\"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.333999 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-ovsdbserver-sb\") pod \"dnsmasq-dns-66c766bfdf-qf7h6\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.334025 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-config-data\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.334069 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgr7j\" (UniqueName: \"kubernetes.io/projected/6cd77935-7853-4e79-becd-771dfdb9cd4f-kube-api-access-pgr7j\") pod \"cinder-scheduler-0\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.334361 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-scripts\") pod \"ceilometer-0\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.334406 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7ggl\" (UniqueName: \"kubernetes.io/projected/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-kube-api-access-s7ggl\") pod \"ceilometer-0\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 
07:02:10.334435 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d36d6645-0c4c-431d-82fc-9fc744549046-logs\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.334575 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cd77935-7853-4e79-becd-771dfdb9cd4f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.336694 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-run-httpd\") pod \"ceilometer-0\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.337664 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-log-httpd\") pod \"ceilometer-0\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.343952 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.346148 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-scripts\") pod \"ceilometer-0\" (UID: 
\"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.346635 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.346791 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-config-data\") pod \"ceilometer-0\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.346935 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.347128 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-config-data\") pod \"cinder-scheduler-0\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.347139 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.347496 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-scripts\") pod \"cinder-scheduler-0\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.350458 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgr7j\" (UniqueName: \"kubernetes.io/projected/6cd77935-7853-4e79-becd-771dfdb9cd4f-kube-api-access-pgr7j\") pod \"cinder-scheduler-0\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.351291 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7ggl\" (UniqueName: \"kubernetes.io/projected/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-kube-api-access-s7ggl\") pod \"ceilometer-0\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") " pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.381929 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.440544 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd5pg\" (UniqueName: \"kubernetes.io/projected/d36d6645-0c4c-431d-82fc-9fc744549046-kube-api-access-nd5pg\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.441166 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-ovsdbserver-sb\") pod \"dnsmasq-dns-66c766bfdf-qf7h6\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.442351 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-ovsdbserver-sb\") pod \"dnsmasq-dns-66c766bfdf-qf7h6\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.442476 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-config-data\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.451911 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d36d6645-0c4c-431d-82fc-9fc744549046-logs\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.452156 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-dns-svc\") pod \"dnsmasq-dns-66c766bfdf-qf7h6\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.452241 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-ovsdbserver-nb\") pod \"dnsmasq-dns-66c766bfdf-qf7h6\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.452310 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-scripts\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.452471 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-config-data-custom\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.452636 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-config\") pod \"dnsmasq-dns-66c766bfdf-qf7h6\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.452739 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-dns-swift-storage-0\") pod \"dnsmasq-dns-66c766bfdf-qf7h6\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.452906 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.453101 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d36d6645-0c4c-431d-82fc-9fc744549046-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.453226 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mdzs\" (UniqueName: \"kubernetes.io/projected/967c80f8-047c-4c0a-a81f-25b6741caf0a-kube-api-access-2mdzs\") pod \"dnsmasq-dns-66c766bfdf-qf7h6\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.453721 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d36d6645-0c4c-431d-82fc-9fc744549046-logs\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.453839 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d36d6645-0c4c-431d-82fc-9fc744549046-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " 
pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.453946 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-dns-svc\") pod \"dnsmasq-dns-66c766bfdf-qf7h6\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.454394 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-config\") pod \"dnsmasq-dns-66c766bfdf-qf7h6\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.457847 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-dns-swift-storage-0\") pod \"dnsmasq-dns-66c766bfdf-qf7h6\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.459058 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-scripts\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.459138 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-config-data-custom\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.459078 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-ovsdbserver-nb\") pod \"dnsmasq-dns-66c766bfdf-qf7h6\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.464011 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd5pg\" (UniqueName: \"kubernetes.io/projected/d36d6645-0c4c-431d-82fc-9fc744549046-kube-api-access-nd5pg\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.464595 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.465666 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-config-data\") pod \"cinder-api-0\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.472897 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mdzs\" (UniqueName: \"kubernetes.io/projected/967c80f8-047c-4c0a-a81f-25b6741caf0a-kube-api-access-2mdzs\") pod \"dnsmasq-dns-66c766bfdf-qf7h6\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.526086 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.611870 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.621984 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.782717 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-865c95569-jxblm" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.784652 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-865c95569-jxblm" Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.856606 4845 generic.go:334] "Generic (PLEG): container finished" podID="d4713c22-fab5-490b-b167-abae4ecfca11" containerID="8727d7d487d095e6cab6601d2e153a1c869bbc6a443e26047b1e7773ff9f300f" exitCode=0 Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.857106 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4g7bh" event={"ID":"d4713c22-fab5-490b-b167-abae4ecfca11","Type":"ContainerDied","Data":"8727d7d487d095e6cab6601d2e153a1c869bbc6a443e26047b1e7773ff9f300f"} Oct 06 07:02:10 crc kubenswrapper[4845]: I1006 07:02:10.930199 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:02:11 crc kubenswrapper[4845]: W1006 07:02:11.057532 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cd77935_7853_4e79_becd_771dfdb9cd4f.slice/crio-327c050635a1e6912f0f8d98b3e339d9d467d0a35e1a19b8ba543640d85532f5 WatchSource:0}: Error finding container 327c050635a1e6912f0f8d98b3e339d9d467d0a35e1a19b8ba543640d85532f5: Status 404 returned error can't find the container with id 327c050635a1e6912f0f8d98b3e339d9d467d0a35e1a19b8ba543640d85532f5 Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.057964 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-scheduler-0"] Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.421419 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.441061 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66c766bfdf-qf7h6"] Oct 06 07:02:11 crc kubenswrapper[4845]: W1006 07:02:11.486900 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod967c80f8_047c_4c0a_a81f_25b6741caf0a.slice/crio-05016e4f1893d7e87e3cf77c6dfe80abee78a39d5e6c10d66414a86b723799d9 WatchSource:0}: Error finding container 05016e4f1893d7e87e3cf77c6dfe80abee78a39d5e6c10d66414a86b723799d9: Status 404 returned error can't find the container with id 05016e4f1893d7e87e3cf77c6dfe80abee78a39d5e6c10d66414a86b723799d9 Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.508222 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.721754 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-2ct4d" Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.741525 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-zdqjb" Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.789259 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk5xl\" (UniqueName: \"kubernetes.io/projected/168d541f-1e89-45b6-9817-91a8242a44fd-kube-api-access-vk5xl\") pod \"168d541f-1e89-45b6-9817-91a8242a44fd\" (UID: \"168d541f-1e89-45b6-9817-91a8242a44fd\") " Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.789757 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7xwb\" (UniqueName: \"kubernetes.io/projected/4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd-kube-api-access-h7xwb\") pod \"4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd\" (UID: \"4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd\") " Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.799550 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/168d541f-1e89-45b6-9817-91a8242a44fd-kube-api-access-vk5xl" (OuterVolumeSpecName: "kube-api-access-vk5xl") pod "168d541f-1e89-45b6-9817-91a8242a44fd" (UID: "168d541f-1e89-45b6-9817-91a8242a44fd"). InnerVolumeSpecName "kube-api-access-vk5xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.800826 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd-kube-api-access-h7xwb" (OuterVolumeSpecName: "kube-api-access-h7xwb") pod "4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd" (UID: "4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd"). InnerVolumeSpecName "kube-api-access-h7xwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.891022 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" event={"ID":"967c80f8-047c-4c0a-a81f-25b6741caf0a","Type":"ContainerStarted","Data":"c9d77cc51870e479df0ac073de335e1c640abf9e4f1ca59f987fad57c6168349"} Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.891067 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" event={"ID":"967c80f8-047c-4c0a-a81f-25b6741caf0a","Type":"ContainerStarted","Data":"05016e4f1893d7e87e3cf77c6dfe80abee78a39d5e6c10d66414a86b723799d9"} Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.893323 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7xwb\" (UniqueName: \"kubernetes.io/projected/4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd-kube-api-access-h7xwb\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.893348 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk5xl\" (UniqueName: \"kubernetes.io/projected/168d541f-1e89-45b6-9817-91a8242a44fd-kube-api-access-vk5xl\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.894884 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6cd77935-7853-4e79-becd-771dfdb9cd4f","Type":"ContainerStarted","Data":"327c050635a1e6912f0f8d98b3e339d9d467d0a35e1a19b8ba543640d85532f5"} Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.896395 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c","Type":"ContainerStarted","Data":"cd6bdda11796f54b43d79a21ac5f7eafd5cca9c091343ddf3d3d1cc772edd08f"} Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.896418 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c","Type":"ContainerStarted","Data":"de87d91c15e08fcaded21356601aa9c891f2bbaee7118f3ff726662389e98e47"} Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.906855 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-2ct4d" event={"ID":"4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd","Type":"ContainerDied","Data":"392f530419ce16b40e9559593e0972334b6ad2ab7495e9f81d868e5782e15408"} Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.906896 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="392f530419ce16b40e9559593e0972334b6ad2ab7495e9f81d868e5782e15408" Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.906949 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-2ct4d" Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.918337 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zdqjb" event={"ID":"168d541f-1e89-45b6-9817-91a8242a44fd","Type":"ContainerDied","Data":"0f1b0d9c9f30119cc773816e772b7343e18cae3f4d0edd59d66ac7c53b07974e"} Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.918405 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f1b0d9c9f30119cc773816e772b7343e18cae3f4d0edd59d66ac7c53b07974e" Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.918592 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-zdqjb" Oct 06 07:02:11 crc kubenswrapper[4845]: I1006 07:02:11.920802 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d36d6645-0c4c-431d-82fc-9fc744549046","Type":"ContainerStarted","Data":"c1f6db4023692f2209005a4e6f95ab595da924d3fd0a494c649ead6668580f86"} Oct 06 07:02:12 crc kubenswrapper[4845]: I1006 07:02:12.037497 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 06 07:02:12 crc kubenswrapper[4845]: I1006 07:02:12.315079 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4g7bh" Oct 06 07:02:12 crc kubenswrapper[4845]: I1006 07:02:12.402142 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75fb6\" (UniqueName: \"kubernetes.io/projected/d4713c22-fab5-490b-b167-abae4ecfca11-kube-api-access-75fb6\") pod \"d4713c22-fab5-490b-b167-abae4ecfca11\" (UID: \"d4713c22-fab5-490b-b167-abae4ecfca11\") " Oct 06 07:02:12 crc kubenswrapper[4845]: I1006 07:02:12.406297 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4713c22-fab5-490b-b167-abae4ecfca11-kube-api-access-75fb6" (OuterVolumeSpecName: "kube-api-access-75fb6") pod "d4713c22-fab5-490b-b167-abae4ecfca11" (UID: "d4713c22-fab5-490b-b167-abae4ecfca11"). InnerVolumeSpecName "kube-api-access-75fb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:02:12 crc kubenswrapper[4845]: I1006 07:02:12.504436 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75fb6\" (UniqueName: \"kubernetes.io/projected/d4713c22-fab5-490b-b167-abae4ecfca11-kube-api-access-75fb6\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:12 crc kubenswrapper[4845]: I1006 07:02:12.941327 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6cd77935-7853-4e79-becd-771dfdb9cd4f","Type":"ContainerStarted","Data":"b3e1de9452bb05a34a9b45fa770610e273a0586ec4a41b4aa51c96304a7f14bb"} Oct 06 07:02:12 crc kubenswrapper[4845]: I1006 07:02:12.944605 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c","Type":"ContainerStarted","Data":"bc7f49c65b4ad389167e154bd0992cecfc4a510d71496327423a9b097bf4781d"} Oct 06 07:02:12 crc kubenswrapper[4845]: I1006 07:02:12.946522 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4g7bh" event={"ID":"d4713c22-fab5-490b-b167-abae4ecfca11","Type":"ContainerDied","Data":"f41536c64fae359d4fe3f12d4f5f90a059de137d50415f3cce002ac7051aac8a"} Oct 06 07:02:12 crc kubenswrapper[4845]: I1006 07:02:12.946545 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f41536c64fae359d4fe3f12d4f5f90a059de137d50415f3cce002ac7051aac8a" Oct 06 07:02:12 crc kubenswrapper[4845]: I1006 07:02:12.946595 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-4g7bh" Oct 06 07:02:12 crc kubenswrapper[4845]: I1006 07:02:12.957636 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d36d6645-0c4c-431d-82fc-9fc744549046","Type":"ContainerStarted","Data":"7a010c46293f48a188a0c472d9d6e7ff119dce31c8ed26fd8e9e9bf2df9a8a6d"} Oct 06 07:02:12 crc kubenswrapper[4845]: I1006 07:02:12.970485 4845 generic.go:334] "Generic (PLEG): container finished" podID="967c80f8-047c-4c0a-a81f-25b6741caf0a" containerID="c9d77cc51870e479df0ac073de335e1c640abf9e4f1ca59f987fad57c6168349" exitCode=0 Oct 06 07:02:12 crc kubenswrapper[4845]: I1006 07:02:12.970528 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" event={"ID":"967c80f8-047c-4c0a-a81f-25b6741caf0a","Type":"ContainerDied","Data":"c9d77cc51870e479df0ac073de335e1c640abf9e4f1ca59f987fad57c6168349"} Oct 06 07:02:13 crc kubenswrapper[4845]: I1006 07:02:13.023030 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-c7b0-account-create-ztfjf"] Oct 06 07:02:13 crc kubenswrapper[4845]: E1006 07:02:13.023500 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd" containerName="mariadb-database-create" Oct 06 07:02:13 crc kubenswrapper[4845]: I1006 07:02:13.023517 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd" containerName="mariadb-database-create" Oct 06 07:02:13 crc kubenswrapper[4845]: E1006 07:02:13.023535 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168d541f-1e89-45b6-9817-91a8242a44fd" containerName="mariadb-database-create" Oct 06 07:02:13 crc kubenswrapper[4845]: I1006 07:02:13.023541 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="168d541f-1e89-45b6-9817-91a8242a44fd" containerName="mariadb-database-create" Oct 06 07:02:13 crc kubenswrapper[4845]: E1006 07:02:13.023570 4845 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4713c22-fab5-490b-b167-abae4ecfca11" containerName="mariadb-database-create" Oct 06 07:02:13 crc kubenswrapper[4845]: I1006 07:02:13.023576 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4713c22-fab5-490b-b167-abae4ecfca11" containerName="mariadb-database-create" Oct 06 07:02:13 crc kubenswrapper[4845]: I1006 07:02:13.023751 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd" containerName="mariadb-database-create" Oct 06 07:02:13 crc kubenswrapper[4845]: I1006 07:02:13.023774 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4713c22-fab5-490b-b167-abae4ecfca11" containerName="mariadb-database-create" Oct 06 07:02:13 crc kubenswrapper[4845]: I1006 07:02:13.023784 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="168d541f-1e89-45b6-9817-91a8242a44fd" containerName="mariadb-database-create" Oct 06 07:02:13 crc kubenswrapper[4845]: I1006 07:02:13.024754 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c7b0-account-create-ztfjf" Oct 06 07:02:13 crc kubenswrapper[4845]: I1006 07:02:13.042639 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 06 07:02:13 crc kubenswrapper[4845]: I1006 07:02:13.053629 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c7b0-account-create-ztfjf"] Oct 06 07:02:13 crc kubenswrapper[4845]: I1006 07:02:13.116094 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k5mb\" (UniqueName: \"kubernetes.io/projected/3985a14b-5690-43fb-98ed-b6bd2fe9ea1a-kube-api-access-2k5mb\") pod \"nova-cell1-c7b0-account-create-ztfjf\" (UID: \"3985a14b-5690-43fb-98ed-b6bd2fe9ea1a\") " pod="openstack/nova-cell1-c7b0-account-create-ztfjf" Oct 06 07:02:13 crc kubenswrapper[4845]: I1006 07:02:13.217984 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k5mb\" (UniqueName: \"kubernetes.io/projected/3985a14b-5690-43fb-98ed-b6bd2fe9ea1a-kube-api-access-2k5mb\") pod \"nova-cell1-c7b0-account-create-ztfjf\" (UID: \"3985a14b-5690-43fb-98ed-b6bd2fe9ea1a\") " pod="openstack/nova-cell1-c7b0-account-create-ztfjf" Oct 06 07:02:13 crc kubenswrapper[4845]: I1006 07:02:13.241174 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k5mb\" (UniqueName: \"kubernetes.io/projected/3985a14b-5690-43fb-98ed-b6bd2fe9ea1a-kube-api-access-2k5mb\") pod \"nova-cell1-c7b0-account-create-ztfjf\" (UID: \"3985a14b-5690-43fb-98ed-b6bd2fe9ea1a\") " pod="openstack/nova-cell1-c7b0-account-create-ztfjf" Oct 06 07:02:13 crc kubenswrapper[4845]: I1006 07:02:13.373440 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c7b0-account-create-ztfjf" Oct 06 07:02:13 crc kubenswrapper[4845]: I1006 07:02:13.874083 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c7b0-account-create-ztfjf"] Oct 06 07:02:13 crc kubenswrapper[4845]: I1006 07:02:13.989149 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c","Type":"ContainerStarted","Data":"6b721f8e37f42bdd56830d5e0684ffd704231ab2bdb010b2119a8fcc74661902"} Oct 06 07:02:13 crc kubenswrapper[4845]: I1006 07:02:13.992324 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d36d6645-0c4c-431d-82fc-9fc744549046","Type":"ContainerStarted","Data":"1659870c9e07ec4a93b630f33d398b3f09c1e14a4334166d45b06f25a69ba1e9"} Oct 06 07:02:13 crc kubenswrapper[4845]: I1006 07:02:13.992501 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d36d6645-0c4c-431d-82fc-9fc744549046" containerName="cinder-api-log" containerID="cri-o://7a010c46293f48a188a0c472d9d6e7ff119dce31c8ed26fd8e9e9bf2df9a8a6d" gracePeriod=30 Oct 06 07:02:13 crc kubenswrapper[4845]: I1006 07:02:13.992866 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d36d6645-0c4c-431d-82fc-9fc744549046" containerName="cinder-api" containerID="cri-o://1659870c9e07ec4a93b630f33d398b3f09c1e14a4334166d45b06f25a69ba1e9" gracePeriod=30 Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.000432 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" event={"ID":"967c80f8-047c-4c0a-a81f-25b6741caf0a","Type":"ContainerStarted","Data":"c5b80cc68c9a4f10f90feb1440ad39a8b9cb5d60226fbc79ace467f7f8c0b17a"} Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.000504 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.003668 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6cd77935-7853-4e79-becd-771dfdb9cd4f","Type":"ContainerStarted","Data":"8088f5dfd5e83623d6837fc3ac0103337a8873d01801808a21111cb9036ff499"} Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.005899 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c7b0-account-create-ztfjf" event={"ID":"3985a14b-5690-43fb-98ed-b6bd2fe9ea1a","Type":"ContainerStarted","Data":"ac346404f705e2bc2f814660f65e0748a5dfc7a642a81effd0f28096752d1adb"} Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.015265 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.015248201 podStartE2EDuration="4.015248201s" podCreationTimestamp="2025-10-06 07:02:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:02:14.009448212 +0000 UTC m=+1018.524189220" watchObservedRunningTime="2025-10-06 07:02:14.015248201 +0000 UTC m=+1018.529989209" Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.047064 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.625006468 podStartE2EDuration="4.047036959s" podCreationTimestamp="2025-10-06 07:02:10 +0000 UTC" firstStartedPulling="2025-10-06 07:02:11.073882342 +0000 UTC m=+1015.588623350" lastFinishedPulling="2025-10-06 07:02:11.495912833 +0000 UTC m=+1016.010653841" observedRunningTime="2025-10-06 07:02:14.040718796 +0000 UTC m=+1018.555459804" watchObservedRunningTime="2025-10-06 07:02:14.047036959 +0000 UTC m=+1018.561777967" Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.063731 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" podStartSLOduration=4.063710357 podStartE2EDuration="4.063710357s" podCreationTimestamp="2025-10-06 07:02:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:02:14.060240548 +0000 UTC m=+1018.574981556" watchObservedRunningTime="2025-10-06 07:02:14.063710357 +0000 UTC m=+1018.578451365" Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.719411 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.856657 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-scripts\") pod \"d36d6645-0c4c-431d-82fc-9fc744549046\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.856894 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-config-data\") pod \"d36d6645-0c4c-431d-82fc-9fc744549046\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.856974 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d36d6645-0c4c-431d-82fc-9fc744549046-logs\") pod \"d36d6645-0c4c-431d-82fc-9fc744549046\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.856995 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nd5pg\" (UniqueName: \"kubernetes.io/projected/d36d6645-0c4c-431d-82fc-9fc744549046-kube-api-access-nd5pg\") pod \"d36d6645-0c4c-431d-82fc-9fc744549046\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " Oct 06 
07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.857056 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-combined-ca-bundle\") pod \"d36d6645-0c4c-431d-82fc-9fc744549046\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.857108 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-config-data-custom\") pod \"d36d6645-0c4c-431d-82fc-9fc744549046\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.857219 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d36d6645-0c4c-431d-82fc-9fc744549046-etc-machine-id\") pod \"d36d6645-0c4c-431d-82fc-9fc744549046\" (UID: \"d36d6645-0c4c-431d-82fc-9fc744549046\") " Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.857674 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d36d6645-0c4c-431d-82fc-9fc744549046-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d36d6645-0c4c-431d-82fc-9fc744549046" (UID: "d36d6645-0c4c-431d-82fc-9fc744549046"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.857860 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36d6645-0c4c-431d-82fc-9fc744549046-logs" (OuterVolumeSpecName: "logs") pod "d36d6645-0c4c-431d-82fc-9fc744549046" (UID: "d36d6645-0c4c-431d-82fc-9fc744549046"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.877546 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d36d6645-0c4c-431d-82fc-9fc744549046" (UID: "d36d6645-0c4c-431d-82fc-9fc744549046"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.877627 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-scripts" (OuterVolumeSpecName: "scripts") pod "d36d6645-0c4c-431d-82fc-9fc744549046" (UID: "d36d6645-0c4c-431d-82fc-9fc744549046"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.887575 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d36d6645-0c4c-431d-82fc-9fc744549046-kube-api-access-nd5pg" (OuterVolumeSpecName: "kube-api-access-nd5pg") pod "d36d6645-0c4c-431d-82fc-9fc744549046" (UID: "d36d6645-0c4c-431d-82fc-9fc744549046"). InnerVolumeSpecName "kube-api-access-nd5pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.918486 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-config-data" (OuterVolumeSpecName: "config-data") pod "d36d6645-0c4c-431d-82fc-9fc744549046" (UID: "d36d6645-0c4c-431d-82fc-9fc744549046"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.940597 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d36d6645-0c4c-431d-82fc-9fc744549046" (UID: "d36d6645-0c4c-431d-82fc-9fc744549046"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.959294 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.959535 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.959615 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nd5pg\" (UniqueName: \"kubernetes.io/projected/d36d6645-0c4c-431d-82fc-9fc744549046-kube-api-access-nd5pg\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.959671 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d36d6645-0c4c-431d-82fc-9fc744549046-logs\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.959723 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.959776 4845 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d36d6645-0c4c-431d-82fc-9fc744549046-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:14 crc kubenswrapper[4845]: I1006 07:02:14.959833 4845 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d36d6645-0c4c-431d-82fc-9fc744549046-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.017899 4845 generic.go:334] "Generic (PLEG): container finished" podID="3985a14b-5690-43fb-98ed-b6bd2fe9ea1a" containerID="2765602a1e6060186abbd0599bc9af71cd1731ce86d275d0a87f6bdf40477ca8" exitCode=0 Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.018046 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c7b0-account-create-ztfjf" event={"ID":"3985a14b-5690-43fb-98ed-b6bd2fe9ea1a","Type":"ContainerDied","Data":"2765602a1e6060186abbd0599bc9af71cd1731ce86d275d0a87f6bdf40477ca8"} Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.020726 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c","Type":"ContainerStarted","Data":"d4854a2b6ce010f9c1a6718a5db7bbe6a57e0a036de7b103cbbe48923c3d3e1c"} Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.021120 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerName="ceilometer-central-agent" containerID="cri-o://cd6bdda11796f54b43d79a21ac5f7eafd5cca9c091343ddf3d3d1cc772edd08f" gracePeriod=30 Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.021177 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerName="proxy-httpd" containerID="cri-o://d4854a2b6ce010f9c1a6718a5db7bbe6a57e0a036de7b103cbbe48923c3d3e1c" gracePeriod=30 Oct 06 07:02:15 crc kubenswrapper[4845]: 
I1006 07:02:15.021201 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerName="ceilometer-notification-agent" containerID="cri-o://bc7f49c65b4ad389167e154bd0992cecfc4a510d71496327423a9b097bf4781d" gracePeriod=30 Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.021225 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerName="sg-core" containerID="cri-o://6b721f8e37f42bdd56830d5e0684ffd704231ab2bdb010b2119a8fcc74661902" gracePeriod=30 Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.021595 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.023855 4845 generic.go:334] "Generic (PLEG): container finished" podID="d36d6645-0c4c-431d-82fc-9fc744549046" containerID="1659870c9e07ec4a93b630f33d398b3f09c1e14a4334166d45b06f25a69ba1e9" exitCode=0 Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.023897 4845 generic.go:334] "Generic (PLEG): container finished" podID="d36d6645-0c4c-431d-82fc-9fc744549046" containerID="7a010c46293f48a188a0c472d9d6e7ff119dce31c8ed26fd8e9e9bf2df9a8a6d" exitCode=143 Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.023966 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.023966 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d36d6645-0c4c-431d-82fc-9fc744549046","Type":"ContainerDied","Data":"1659870c9e07ec4a93b630f33d398b3f09c1e14a4334166d45b06f25a69ba1e9"} Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.024011 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d36d6645-0c4c-431d-82fc-9fc744549046","Type":"ContainerDied","Data":"7a010c46293f48a188a0c472d9d6e7ff119dce31c8ed26fd8e9e9bf2df9a8a6d"} Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.024023 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d36d6645-0c4c-431d-82fc-9fc744549046","Type":"ContainerDied","Data":"c1f6db4023692f2209005a4e6f95ab595da924d3fd0a494c649ead6668580f86"} Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.024048 4845 scope.go:117] "RemoveContainer" containerID="1659870c9e07ec4a93b630f33d398b3f09c1e14a4334166d45b06f25a69ba1e9" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.065179 4845 scope.go:117] "RemoveContainer" containerID="7a010c46293f48a188a0c472d9d6e7ff119dce31c8ed26fd8e9e9bf2df9a8a6d" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.090169 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.938038772 podStartE2EDuration="6.09015144s" podCreationTimestamp="2025-10-06 07:02:09 +0000 UTC" firstStartedPulling="2025-10-06 07:02:10.937072434 +0000 UTC m=+1015.451813432" lastFinishedPulling="2025-10-06 07:02:14.089185092 +0000 UTC m=+1018.603926100" observedRunningTime="2025-10-06 07:02:15.067107057 +0000 UTC m=+1019.581848065" watchObservedRunningTime="2025-10-06 07:02:15.09015144 +0000 UTC m=+1019.604892448" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.114119 4845 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.125646 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.126128 4845 scope.go:117] "RemoveContainer" containerID="1659870c9e07ec4a93b630f33d398b3f09c1e14a4334166d45b06f25a69ba1e9" Oct 06 07:02:15 crc kubenswrapper[4845]: E1006 07:02:15.127408 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1659870c9e07ec4a93b630f33d398b3f09c1e14a4334166d45b06f25a69ba1e9\": container with ID starting with 1659870c9e07ec4a93b630f33d398b3f09c1e14a4334166d45b06f25a69ba1e9 not found: ID does not exist" containerID="1659870c9e07ec4a93b630f33d398b3f09c1e14a4334166d45b06f25a69ba1e9" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.127467 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1659870c9e07ec4a93b630f33d398b3f09c1e14a4334166d45b06f25a69ba1e9"} err="failed to get container status \"1659870c9e07ec4a93b630f33d398b3f09c1e14a4334166d45b06f25a69ba1e9\": rpc error: code = NotFound desc = could not find container \"1659870c9e07ec4a93b630f33d398b3f09c1e14a4334166d45b06f25a69ba1e9\": container with ID starting with 1659870c9e07ec4a93b630f33d398b3f09c1e14a4334166d45b06f25a69ba1e9 not found: ID does not exist" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.127501 4845 scope.go:117] "RemoveContainer" containerID="7a010c46293f48a188a0c472d9d6e7ff119dce31c8ed26fd8e9e9bf2df9a8a6d" Oct 06 07:02:15 crc kubenswrapper[4845]: E1006 07:02:15.127922 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a010c46293f48a188a0c472d9d6e7ff119dce31c8ed26fd8e9e9bf2df9a8a6d\": container with ID starting with 7a010c46293f48a188a0c472d9d6e7ff119dce31c8ed26fd8e9e9bf2df9a8a6d not found: ID 
does not exist" containerID="7a010c46293f48a188a0c472d9d6e7ff119dce31c8ed26fd8e9e9bf2df9a8a6d" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.127957 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a010c46293f48a188a0c472d9d6e7ff119dce31c8ed26fd8e9e9bf2df9a8a6d"} err="failed to get container status \"7a010c46293f48a188a0c472d9d6e7ff119dce31c8ed26fd8e9e9bf2df9a8a6d\": rpc error: code = NotFound desc = could not find container \"7a010c46293f48a188a0c472d9d6e7ff119dce31c8ed26fd8e9e9bf2df9a8a6d\": container with ID starting with 7a010c46293f48a188a0c472d9d6e7ff119dce31c8ed26fd8e9e9bf2df9a8a6d not found: ID does not exist" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.127986 4845 scope.go:117] "RemoveContainer" containerID="1659870c9e07ec4a93b630f33d398b3f09c1e14a4334166d45b06f25a69ba1e9" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.128209 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1659870c9e07ec4a93b630f33d398b3f09c1e14a4334166d45b06f25a69ba1e9"} err="failed to get container status \"1659870c9e07ec4a93b630f33d398b3f09c1e14a4334166d45b06f25a69ba1e9\": rpc error: code = NotFound desc = could not find container \"1659870c9e07ec4a93b630f33d398b3f09c1e14a4334166d45b06f25a69ba1e9\": container with ID starting with 1659870c9e07ec4a93b630f33d398b3f09c1e14a4334166d45b06f25a69ba1e9 not found: ID does not exist" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.128227 4845 scope.go:117] "RemoveContainer" containerID="7a010c46293f48a188a0c472d9d6e7ff119dce31c8ed26fd8e9e9bf2df9a8a6d" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.129927 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a010c46293f48a188a0c472d9d6e7ff119dce31c8ed26fd8e9e9bf2df9a8a6d"} err="failed to get container status \"7a010c46293f48a188a0c472d9d6e7ff119dce31c8ed26fd8e9e9bf2df9a8a6d\": rpc error: code = 
NotFound desc = could not find container \"7a010c46293f48a188a0c472d9d6e7ff119dce31c8ed26fd8e9e9bf2df9a8a6d\": container with ID starting with 7a010c46293f48a188a0c472d9d6e7ff119dce31c8ed26fd8e9e9bf2df9a8a6d not found: ID does not exist" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.153250 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 06 07:02:15 crc kubenswrapper[4845]: E1006 07:02:15.155222 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36d6645-0c4c-431d-82fc-9fc744549046" containerName="cinder-api-log" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.155251 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36d6645-0c4c-431d-82fc-9fc744549046" containerName="cinder-api-log" Oct 06 07:02:15 crc kubenswrapper[4845]: E1006 07:02:15.155303 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36d6645-0c4c-431d-82fc-9fc744549046" containerName="cinder-api" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.155313 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36d6645-0c4c-431d-82fc-9fc744549046" containerName="cinder-api" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.155564 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36d6645-0c4c-431d-82fc-9fc744549046" containerName="cinder-api-log" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.155584 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36d6645-0c4c-431d-82fc-9fc744549046" containerName="cinder-api" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.156501 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.160838 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.160979 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.161481 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.168866 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.267147 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blr29\" (UniqueName: \"kubernetes.io/projected/0cc2b546-3f23-4c16-af0c-84cce0997fe9-kube-api-access-blr29\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.267471 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc2b546-3f23-4c16-af0c-84cce0997fe9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.269056 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cc2b546-3f23-4c16-af0c-84cce0997fe9-config-data-custom\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.269127 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cc2b546-3f23-4c16-af0c-84cce0997fe9-scripts\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.269148 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0cc2b546-3f23-4c16-af0c-84cce0997fe9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.269163 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc2b546-3f23-4c16-af0c-84cce0997fe9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.269230 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cc2b546-3f23-4c16-af0c-84cce0997fe9-logs\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.269257 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc2b546-3f23-4c16-af0c-84cce0997fe9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.269290 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0cc2b546-3f23-4c16-af0c-84cce0997fe9-config-data\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.370869 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc2b546-3f23-4c16-af0c-84cce0997fe9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.370951 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc2b546-3f23-4c16-af0c-84cce0997fe9-config-data\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.371609 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blr29\" (UniqueName: \"kubernetes.io/projected/0cc2b546-3f23-4c16-af0c-84cce0997fe9-kube-api-access-blr29\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.371669 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc2b546-3f23-4c16-af0c-84cce0997fe9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0" Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.371721 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cc2b546-3f23-4c16-af0c-84cce0997fe9-config-data-custom\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0" Oct 06 07:02:15 crc 
kubenswrapper[4845]: I1006 07:02:15.371755 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cc2b546-3f23-4c16-af0c-84cce0997fe9-scripts\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0"
Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.371772 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0cc2b546-3f23-4c16-af0c-84cce0997fe9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0"
Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.371787 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc2b546-3f23-4c16-af0c-84cce0997fe9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0"
Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.371835 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cc2b546-3f23-4c16-af0c-84cce0997fe9-logs\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0"
Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.372105 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cc2b546-3f23-4c16-af0c-84cce0997fe9-logs\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0"
Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.374504 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0cc2b546-3f23-4c16-af0c-84cce0997fe9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0"
Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.377361 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc2b546-3f23-4c16-af0c-84cce0997fe9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0"
Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.377527 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cc2b546-3f23-4c16-af0c-84cce0997fe9-scripts\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0"
Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.378082 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cc2b546-3f23-4c16-af0c-84cce0997fe9-config-data-custom\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0"
Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.378780 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc2b546-3f23-4c16-af0c-84cce0997fe9-config-data\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0"
Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.378925 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc2b546-3f23-4c16-af0c-84cce0997fe9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0"
Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.379985 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc2b546-3f23-4c16-af0c-84cce0997fe9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0"
Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.392776 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blr29\" (UniqueName: \"kubernetes.io/projected/0cc2b546-3f23-4c16-af0c-84cce0997fe9-kube-api-access-blr29\") pod \"cinder-api-0\" (UID: \"0cc2b546-3f23-4c16-af0c-84cce0997fe9\") " pod="openstack/cinder-api-0"
Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.476425 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.526844 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Oct 06 07:02:15 crc kubenswrapper[4845]: I1006 07:02:15.957643 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 06 07:02:16 crc kubenswrapper[4845]: I1006 07:02:16.037789 4845 generic.go:334] "Generic (PLEG): container finished" podID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerID="d4854a2b6ce010f9c1a6718a5db7bbe6a57e0a036de7b103cbbe48923c3d3e1c" exitCode=0
Oct 06 07:02:16 crc kubenswrapper[4845]: I1006 07:02:16.038108 4845 generic.go:334] "Generic (PLEG): container finished" podID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerID="6b721f8e37f42bdd56830d5e0684ffd704231ab2bdb010b2119a8fcc74661902" exitCode=2
Oct 06 07:02:16 crc kubenswrapper[4845]: I1006 07:02:16.038118 4845 generic.go:334] "Generic (PLEG): container finished" podID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerID="bc7f49c65b4ad389167e154bd0992cecfc4a510d71496327423a9b097bf4781d" exitCode=0
Oct 06 07:02:16 crc kubenswrapper[4845]: I1006 07:02:16.038163 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c","Type":"ContainerDied","Data":"d4854a2b6ce010f9c1a6718a5db7bbe6a57e0a036de7b103cbbe48923c3d3e1c"}
Oct 06 07:02:16 crc kubenswrapper[4845]: I1006 07:02:16.038195 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c","Type":"ContainerDied","Data":"6b721f8e37f42bdd56830d5e0684ffd704231ab2bdb010b2119a8fcc74661902"}
Oct 06 07:02:16 crc kubenswrapper[4845]: I1006 07:02:16.038206 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c","Type":"ContainerDied","Data":"bc7f49c65b4ad389167e154bd0992cecfc4a510d71496327423a9b097bf4781d"}
Oct 06 07:02:16 crc kubenswrapper[4845]: I1006 07:02:16.041064 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0cc2b546-3f23-4c16-af0c-84cce0997fe9","Type":"ContainerStarted","Data":"df2f8d3b4405860e4e3182f01c3d569628041be496064df479878ff5bc478766"}
Oct 06 07:02:16 crc kubenswrapper[4845]: I1006 07:02:16.238595 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d36d6645-0c4c-431d-82fc-9fc744549046" path="/var/lib/kubelet/pods/d36d6645-0c4c-431d-82fc-9fc744549046/volumes"
Oct 06 07:02:16 crc kubenswrapper[4845]: I1006 07:02:16.293005 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c7b0-account-create-ztfjf"
Oct 06 07:02:16 crc kubenswrapper[4845]: I1006 07:02:16.402110 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k5mb\" (UniqueName: \"kubernetes.io/projected/3985a14b-5690-43fb-98ed-b6bd2fe9ea1a-kube-api-access-2k5mb\") pod \"3985a14b-5690-43fb-98ed-b6bd2fe9ea1a\" (UID: \"3985a14b-5690-43fb-98ed-b6bd2fe9ea1a\") "
Oct 06 07:02:16 crc kubenswrapper[4845]: I1006 07:02:16.408352 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3985a14b-5690-43fb-98ed-b6bd2fe9ea1a-kube-api-access-2k5mb" (OuterVolumeSpecName: "kube-api-access-2k5mb") pod "3985a14b-5690-43fb-98ed-b6bd2fe9ea1a" (UID: "3985a14b-5690-43fb-98ed-b6bd2fe9ea1a"). InnerVolumeSpecName "kube-api-access-2k5mb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:02:16 crc kubenswrapper[4845]: I1006 07:02:16.504302 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k5mb\" (UniqueName: \"kubernetes.io/projected/3985a14b-5690-43fb-98ed-b6bd2fe9ea1a-kube-api-access-2k5mb\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:17 crc kubenswrapper[4845]: I1006 07:02:17.054546 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c7b0-account-create-ztfjf"
Oct 06 07:02:17 crc kubenswrapper[4845]: I1006 07:02:17.054542 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c7b0-account-create-ztfjf" event={"ID":"3985a14b-5690-43fb-98ed-b6bd2fe9ea1a","Type":"ContainerDied","Data":"ac346404f705e2bc2f814660f65e0748a5dfc7a642a81effd0f28096752d1adb"}
Oct 06 07:02:17 crc kubenswrapper[4845]: I1006 07:02:17.054710 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac346404f705e2bc2f814660f65e0748a5dfc7a642a81effd0f28096752d1adb"
Oct 06 07:02:17 crc kubenswrapper[4845]: I1006 07:02:17.057952 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0cc2b546-3f23-4c16-af0c-84cce0997fe9","Type":"ContainerStarted","Data":"151aceb7ede26227f05ff1919e41c2a49a96b032326699ee577e7f8ad45a16ea"}
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.069099 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0cc2b546-3f23-4c16-af0c-84cce0997fe9","Type":"ContainerStarted","Data":"c1899270d2dd92bb14b77064a2f924d730ba76e5a5b0ed1a8733d8a61f67d00d"}
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.069450 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.094924 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.094899459 podStartE2EDuration="3.094899459s" podCreationTimestamp="2025-10-06 07:02:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:02:18.084105631 +0000 UTC m=+1022.598846639" watchObservedRunningTime="2025-10-06 07:02:18.094899459 +0000 UTC m=+1022.609640477"
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.613459 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.746742 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7ggl\" (UniqueName: \"kubernetes.io/projected/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-kube-api-access-s7ggl\") pod \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") "
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.746883 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-run-httpd\") pod \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") "
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.746928 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-sg-core-conf-yaml\") pod \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") "
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.746964 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-scripts\") pod \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") "
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.747057 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-log-httpd\") pod \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") "
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.747153 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-combined-ca-bundle\") pod \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") "
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.747204 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-config-data\") pod \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\" (UID: \"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c\") "
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.747295 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" (UID: "7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.747656 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" (UID: "7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.747924 4845 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.747948 4845 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.752830 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-scripts" (OuterVolumeSpecName: "scripts") pod "7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" (UID: "7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.755402 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-kube-api-access-s7ggl" (OuterVolumeSpecName: "kube-api-access-s7ggl") pod "7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" (UID: "7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c"). InnerVolumeSpecName "kube-api-access-s7ggl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.795698 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" (UID: "7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.841509 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" (UID: "7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.849273 4845 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.849308 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.849317 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.849326 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7ggl\" (UniqueName: \"kubernetes.io/projected/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-kube-api-access-s7ggl\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.859813 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-config-data" (OuterVolumeSpecName: "config-data") pod "7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" (UID: "7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:02:18 crc kubenswrapper[4845]: I1006 07:02:18.951769 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.080134 4845 generic.go:334] "Generic (PLEG): container finished" podID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerID="cd6bdda11796f54b43d79a21ac5f7eafd5cca9c091343ddf3d3d1cc772edd08f" exitCode=0
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.080204 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.080237 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c","Type":"ContainerDied","Data":"cd6bdda11796f54b43d79a21ac5f7eafd5cca9c091343ddf3d3d1cc772edd08f"}
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.080310 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c","Type":"ContainerDied","Data":"de87d91c15e08fcaded21356601aa9c891f2bbaee7118f3ff726662389e98e47"}
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.080332 4845 scope.go:117] "RemoveContainer" containerID="d4854a2b6ce010f9c1a6718a5db7bbe6a57e0a036de7b103cbbe48923c3d3e1c"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.107016 4845 scope.go:117] "RemoveContainer" containerID="6b721f8e37f42bdd56830d5e0684ffd704231ab2bdb010b2119a8fcc74661902"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.115479 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.125861 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.130586 4845 scope.go:117] "RemoveContainer" containerID="bc7f49c65b4ad389167e154bd0992cecfc4a510d71496327423a9b097bf4781d"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.150119 4845 scope.go:117] "RemoveContainer" containerID="cd6bdda11796f54b43d79a21ac5f7eafd5cca9c091343ddf3d3d1cc772edd08f"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.154452 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:02:19 crc kubenswrapper[4845]: E1006 07:02:19.154912 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerName="ceilometer-notification-agent"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.154937 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerName="ceilometer-notification-agent"
Oct 06 07:02:19 crc kubenswrapper[4845]: E1006 07:02:19.154954 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerName="sg-core"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.154962 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerName="sg-core"
Oct 06 07:02:19 crc kubenswrapper[4845]: E1006 07:02:19.154975 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerName="ceilometer-central-agent"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.154983 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerName="ceilometer-central-agent"
Oct 06 07:02:19 crc kubenswrapper[4845]: E1006 07:02:19.154999 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3985a14b-5690-43fb-98ed-b6bd2fe9ea1a" containerName="mariadb-account-create"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.155006 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3985a14b-5690-43fb-98ed-b6bd2fe9ea1a" containerName="mariadb-account-create"
Oct 06 07:02:19 crc kubenswrapper[4845]: E1006 07:02:19.155025 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerName="proxy-httpd"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.155032 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerName="proxy-httpd"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.155303 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerName="ceilometer-notification-agent"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.155337 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerName="ceilometer-central-agent"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.155350 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerName="sg-core"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.155835 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" containerName="proxy-httpd"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.155860 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="3985a14b-5690-43fb-98ed-b6bd2fe9ea1a" containerName="mariadb-account-create"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.158041 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.160745 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.167532 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.185879 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.222751 4845 scope.go:117] "RemoveContainer" containerID="d4854a2b6ce010f9c1a6718a5db7bbe6a57e0a036de7b103cbbe48923c3d3e1c"
Oct 06 07:02:19 crc kubenswrapper[4845]: E1006 07:02:19.223281 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4854a2b6ce010f9c1a6718a5db7bbe6a57e0a036de7b103cbbe48923c3d3e1c\": container with ID starting with d4854a2b6ce010f9c1a6718a5db7bbe6a57e0a036de7b103cbbe48923c3d3e1c not found: ID does not exist" containerID="d4854a2b6ce010f9c1a6718a5db7bbe6a57e0a036de7b103cbbe48923c3d3e1c"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.223325 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4854a2b6ce010f9c1a6718a5db7bbe6a57e0a036de7b103cbbe48923c3d3e1c"} err="failed to get container status \"d4854a2b6ce010f9c1a6718a5db7bbe6a57e0a036de7b103cbbe48923c3d3e1c\": rpc error: code = NotFound desc = could not find container \"d4854a2b6ce010f9c1a6718a5db7bbe6a57e0a036de7b103cbbe48923c3d3e1c\": container with ID starting with d4854a2b6ce010f9c1a6718a5db7bbe6a57e0a036de7b103cbbe48923c3d3e1c not found: ID does not exist"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.223355 4845 scope.go:117] "RemoveContainer" containerID="6b721f8e37f42bdd56830d5e0684ffd704231ab2bdb010b2119a8fcc74661902"
Oct 06 07:02:19 crc kubenswrapper[4845]: E1006 07:02:19.223744 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b721f8e37f42bdd56830d5e0684ffd704231ab2bdb010b2119a8fcc74661902\": container with ID starting with 6b721f8e37f42bdd56830d5e0684ffd704231ab2bdb010b2119a8fcc74661902 not found: ID does not exist" containerID="6b721f8e37f42bdd56830d5e0684ffd704231ab2bdb010b2119a8fcc74661902"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.223777 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b721f8e37f42bdd56830d5e0684ffd704231ab2bdb010b2119a8fcc74661902"} err="failed to get container status \"6b721f8e37f42bdd56830d5e0684ffd704231ab2bdb010b2119a8fcc74661902\": rpc error: code = NotFound desc = could not find container \"6b721f8e37f42bdd56830d5e0684ffd704231ab2bdb010b2119a8fcc74661902\": container with ID starting with 6b721f8e37f42bdd56830d5e0684ffd704231ab2bdb010b2119a8fcc74661902 not found: ID does not exist"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.223821 4845 scope.go:117] "RemoveContainer" containerID="bc7f49c65b4ad389167e154bd0992cecfc4a510d71496327423a9b097bf4781d"
Oct 06 07:02:19 crc kubenswrapper[4845]: E1006 07:02:19.224041 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc7f49c65b4ad389167e154bd0992cecfc4a510d71496327423a9b097bf4781d\": container with ID starting with bc7f49c65b4ad389167e154bd0992cecfc4a510d71496327423a9b097bf4781d not found: ID does not exist" containerID="bc7f49c65b4ad389167e154bd0992cecfc4a510d71496327423a9b097bf4781d"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.224065 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc7f49c65b4ad389167e154bd0992cecfc4a510d71496327423a9b097bf4781d"} err="failed to get container status \"bc7f49c65b4ad389167e154bd0992cecfc4a510d71496327423a9b097bf4781d\": rpc error: code = NotFound desc = could not find container \"bc7f49c65b4ad389167e154bd0992cecfc4a510d71496327423a9b097bf4781d\": container with ID starting with bc7f49c65b4ad389167e154bd0992cecfc4a510d71496327423a9b097bf4781d not found: ID does not exist"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.224080 4845 scope.go:117] "RemoveContainer" containerID="cd6bdda11796f54b43d79a21ac5f7eafd5cca9c091343ddf3d3d1cc772edd08f"
Oct 06 07:02:19 crc kubenswrapper[4845]: E1006 07:02:19.224284 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd6bdda11796f54b43d79a21ac5f7eafd5cca9c091343ddf3d3d1cc772edd08f\": container with ID starting with cd6bdda11796f54b43d79a21ac5f7eafd5cca9c091343ddf3d3d1cc772edd08f not found: ID does not exist" containerID="cd6bdda11796f54b43d79a21ac5f7eafd5cca9c091343ddf3d3d1cc772edd08f"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.224321 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd6bdda11796f54b43d79a21ac5f7eafd5cca9c091343ddf3d3d1cc772edd08f"} err="failed to get container status \"cd6bdda11796f54b43d79a21ac5f7eafd5cca9c091343ddf3d3d1cc772edd08f\": rpc error: code = NotFound desc = could not find container \"cd6bdda11796f54b43d79a21ac5f7eafd5cca9c091343ddf3d3d1cc772edd08f\": container with ID starting with cd6bdda11796f54b43d79a21ac5f7eafd5cca9c091343ddf3d3d1cc772edd08f not found: ID does not exist"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.256598 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.256692 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-config-data\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.256721 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6711858f-8c8e-49bf-afb8-836bcd2cb545-run-httpd\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.256963 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4xjm\" (UniqueName: \"kubernetes.io/projected/6711858f-8c8e-49bf-afb8-836bcd2cb545-kube-api-access-w4xjm\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.257063 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.257179 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-scripts\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.257242 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6711858f-8c8e-49bf-afb8-836bcd2cb545-log-httpd\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.358921 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6711858f-8c8e-49bf-afb8-836bcd2cb545-log-httpd\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.359044 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.359118 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-config-data\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.359165 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6711858f-8c8e-49bf-afb8-836bcd2cb545-run-httpd\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.359246 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4xjm\" (UniqueName: \"kubernetes.io/projected/6711858f-8c8e-49bf-afb8-836bcd2cb545-kube-api-access-w4xjm\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.359274 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.359317 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-scripts\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.361211 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6711858f-8c8e-49bf-afb8-836bcd2cb545-run-httpd\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.361580 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6711858f-8c8e-49bf-afb8-836bcd2cb545-log-httpd\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.365459 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-config-data\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.365630 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.366104 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.366131 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-scripts\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.385442 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4xjm\" (UniqueName: \"kubernetes.io/projected/6711858f-8c8e-49bf-afb8-836bcd2cb545-kube-api-access-w4xjm\") pod \"ceilometer-0\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.499396 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 07:02:19 crc kubenswrapper[4845]: I1006 07:02:19.939838 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:02:19 crc kubenswrapper[4845]: W1006 07:02:19.952791 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6711858f_8c8e_49bf_afb8_836bcd2cb545.slice/crio-1ef8c4f48bc771c02e72f5b6a49788a8db7fce705a082d91109ce7c502109bb8 WatchSource:0}: Error finding container 1ef8c4f48bc771c02e72f5b6a49788a8db7fce705a082d91109ce7c502109bb8: Status 404 returned error can't find the container with id 1ef8c4f48bc771c02e72f5b6a49788a8db7fce705a082d91109ce7c502109bb8
Oct 06 07:02:20 crc kubenswrapper[4845]: I1006 07:02:20.094286 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6711858f-8c8e-49bf-afb8-836bcd2cb545","Type":"ContainerStarted","Data":"1ef8c4f48bc771c02e72f5b6a49788a8db7fce705a082d91109ce7c502109bb8"}
Oct 06 07:02:20 crc kubenswrapper[4845]: I1006 07:02:20.237605 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c" path="/var/lib/kubelet/pods/7f1020a1-eea4-4a14-b9f3-9b8d1817ae1c/volumes"
Oct 06 07:02:20 crc kubenswrapper[4845]: I1006 07:02:20.613912 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6"
Oct 06 07:02:20 crc kubenswrapper[4845]: I1006 07:02:20.678789 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d85d699f7-d86dl"]
Oct 06 07:02:20 crc kubenswrapper[4845]: I1006 07:02:20.679070 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" podUID="3c0f4b65-1983-440e-a63a-e239797757eb" containerName="dnsmasq-dns" containerID="cri-o://7593053bad6c2906ce130b20b159c868b10c99ebd98ed1e464fcc5beb99d069a" gracePeriod=10
Oct 06 07:02:20 crc kubenswrapper[4845]: I1006 07:02:20.771365 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 06 07:02:20 crc kubenswrapper[4845]: E1006 07:02:20.815219 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c0f4b65_1983_440e_a63a_e239797757eb.slice/crio-7593053bad6c2906ce130b20b159c868b10c99ebd98ed1e464fcc5beb99d069a.scope\": RecentStats: unable to find data in memory cache]"
Oct 06 07:02:20 crc kubenswrapper[4845]: I1006 07:02:20.851765 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.130919 4845 generic.go:334] "Generic (PLEG): container finished" podID="3c0f4b65-1983-440e-a63a-e239797757eb" containerID="7593053bad6c2906ce130b20b159c868b10c99ebd98ed1e464fcc5beb99d069a" exitCode=0
Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.131058 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" event={"ID":"3c0f4b65-1983-440e-a63a-e239797757eb","Type":"ContainerDied","Data":"7593053bad6c2906ce130b20b159c868b10c99ebd98ed1e464fcc5beb99d069a"}
Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.133897 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6cd77935-7853-4e79-becd-771dfdb9cd4f" containerName="cinder-scheduler" containerID="cri-o://b3e1de9452bb05a34a9b45fa770610e273a0586ec4a41b4aa51c96304a7f14bb" gracePeriod=30
Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.134030 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6711858f-8c8e-49bf-afb8-836bcd2cb545","Type":"ContainerStarted","Data":"737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a"}
Oct 06 07:02:21 crc
kubenswrapper[4845]: I1006 07:02:21.134056 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6711858f-8c8e-49bf-afb8-836bcd2cb545","Type":"ContainerStarted","Data":"f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8"} Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.134684 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6cd77935-7853-4e79-becd-771dfdb9cd4f" containerName="probe" containerID="cri-o://8088f5dfd5e83623d6837fc3ac0103337a8873d01801808a21111cb9036ff499" gracePeriod=30 Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.215379 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.303450 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-ovsdbserver-sb\") pod \"3c0f4b65-1983-440e-a63a-e239797757eb\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.304043 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-config\") pod \"3c0f4b65-1983-440e-a63a-e239797757eb\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.304197 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-dns-svc\") pod \"3c0f4b65-1983-440e-a63a-e239797757eb\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.304265 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-dns-swift-storage-0\") pod \"3c0f4b65-1983-440e-a63a-e239797757eb\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.304339 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55dnh\" (UniqueName: \"kubernetes.io/projected/3c0f4b65-1983-440e-a63a-e239797757eb-kube-api-access-55dnh\") pod \"3c0f4b65-1983-440e-a63a-e239797757eb\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.304471 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-ovsdbserver-nb\") pod \"3c0f4b65-1983-440e-a63a-e239797757eb\" (UID: \"3c0f4b65-1983-440e-a63a-e239797757eb\") " Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.331586 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c0f4b65-1983-440e-a63a-e239797757eb-kube-api-access-55dnh" (OuterVolumeSpecName: "kube-api-access-55dnh") pod "3c0f4b65-1983-440e-a63a-e239797757eb" (UID: "3c0f4b65-1983-440e-a63a-e239797757eb"). InnerVolumeSpecName "kube-api-access-55dnh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.406376 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55dnh\" (UniqueName: \"kubernetes.io/projected/3c0f4b65-1983-440e-a63a-e239797757eb-kube-api-access-55dnh\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.439823 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-config" (OuterVolumeSpecName: "config") pod "3c0f4b65-1983-440e-a63a-e239797757eb" (UID: "3c0f4b65-1983-440e-a63a-e239797757eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.460682 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3c0f4b65-1983-440e-a63a-e239797757eb" (UID: "3c0f4b65-1983-440e-a63a-e239797757eb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.469287 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3c0f4b65-1983-440e-a63a-e239797757eb" (UID: "3c0f4b65-1983-440e-a63a-e239797757eb"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.478392 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3c0f4b65-1983-440e-a63a-e239797757eb" (UID: "3c0f4b65-1983-440e-a63a-e239797757eb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.486157 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3c0f4b65-1983-440e-a63a-e239797757eb" (UID: "3c0f4b65-1983-440e-a63a-e239797757eb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.508782 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.508812 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-config\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.508821 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:21 crc kubenswrapper[4845]: I1006 07:02:21.508830 4845 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:21 crc 
kubenswrapper[4845]: I1006 07:02:21.508840 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c0f4b65-1983-440e-a63a-e239797757eb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:22 crc kubenswrapper[4845]: I1006 07:02:22.145431 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6711858f-8c8e-49bf-afb8-836bcd2cb545","Type":"ContainerStarted","Data":"6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8"} Oct 06 07:02:22 crc kubenswrapper[4845]: I1006 07:02:22.147705 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" event={"ID":"3c0f4b65-1983-440e-a63a-e239797757eb","Type":"ContainerDied","Data":"52b9b4743bd4b7d4d44609d4380aa46830c70d7bcee93c15a48916b951e28be1"} Oct 06 07:02:22 crc kubenswrapper[4845]: I1006 07:02:22.147733 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d85d699f7-d86dl" Oct 06 07:02:22 crc kubenswrapper[4845]: I1006 07:02:22.147756 4845 scope.go:117] "RemoveContainer" containerID="7593053bad6c2906ce130b20b159c868b10c99ebd98ed1e464fcc5beb99d069a" Oct 06 07:02:22 crc kubenswrapper[4845]: I1006 07:02:22.155835 4845 generic.go:334] "Generic (PLEG): container finished" podID="6cd77935-7853-4e79-becd-771dfdb9cd4f" containerID="8088f5dfd5e83623d6837fc3ac0103337a8873d01801808a21111cb9036ff499" exitCode=0 Oct 06 07:02:22 crc kubenswrapper[4845]: I1006 07:02:22.155879 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6cd77935-7853-4e79-becd-771dfdb9cd4f","Type":"ContainerDied","Data":"8088f5dfd5e83623d6837fc3ac0103337a8873d01801808a21111cb9036ff499"} Oct 06 07:02:22 crc kubenswrapper[4845]: I1006 07:02:22.176146 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d85d699f7-d86dl"] Oct 06 07:02:22 crc kubenswrapper[4845]: I1006 
07:02:22.177755 4845 scope.go:117] "RemoveContainer" containerID="2c122a44832cbca574968c775582e03d71a3d4ec67cf3ac007ec2d7e1fe56546" Oct 06 07:02:22 crc kubenswrapper[4845]: I1006 07:02:22.193840 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d85d699f7-d86dl"] Oct 06 07:02:22 crc kubenswrapper[4845]: I1006 07:02:22.237079 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c0f4b65-1983-440e-a63a-e239797757eb" path="/var/lib/kubelet/pods/3c0f4b65-1983-440e-a63a-e239797757eb/volumes" Oct 06 07:02:22 crc kubenswrapper[4845]: I1006 07:02:22.867661 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-88b1-account-create-f9fm7"] Oct 06 07:02:22 crc kubenswrapper[4845]: E1006 07:02:22.868265 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c0f4b65-1983-440e-a63a-e239797757eb" containerName="dnsmasq-dns" Oct 06 07:02:22 crc kubenswrapper[4845]: I1006 07:02:22.868282 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0f4b65-1983-440e-a63a-e239797757eb" containerName="dnsmasq-dns" Oct 06 07:02:22 crc kubenswrapper[4845]: E1006 07:02:22.868298 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c0f4b65-1983-440e-a63a-e239797757eb" containerName="init" Oct 06 07:02:22 crc kubenswrapper[4845]: I1006 07:02:22.868304 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0f4b65-1983-440e-a63a-e239797757eb" containerName="init" Oct 06 07:02:22 crc kubenswrapper[4845]: I1006 07:02:22.868561 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c0f4b65-1983-440e-a63a-e239797757eb" containerName="dnsmasq-dns" Oct 06 07:02:22 crc kubenswrapper[4845]: I1006 07:02:22.869160 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-88b1-account-create-f9fm7" Oct 06 07:02:22 crc kubenswrapper[4845]: I1006 07:02:22.877769 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 06 07:02:22 crc kubenswrapper[4845]: I1006 07:02:22.888502 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-88b1-account-create-f9fm7"] Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.034308 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29bfx\" (UniqueName: \"kubernetes.io/projected/4f02af01-d26c-400e-9518-a458cb5db6b9-kube-api-access-29bfx\") pod \"nova-api-88b1-account-create-f9fm7\" (UID: \"4f02af01-d26c-400e-9518-a458cb5db6b9\") " pod="openstack/nova-api-88b1-account-create-f9fm7" Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.051849 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.052135 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f407712f-9d26-4102-a0c5-e7bfc61800ec" containerName="glance-log" containerID="cri-o://42e40e0302e0b1b70505c7633e8c3883db0e48f8173b554e87e6c706ef6d3040" gracePeriod=30 Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.052286 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f407712f-9d26-4102-a0c5-e7bfc61800ec" containerName="glance-httpd" containerID="cri-o://1ce44fd2833ce3a9bf718f3299f27f46514d703a42fd910a8f6d20006f404710" gracePeriod=30 Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.076249 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4400-account-create-f658t"] Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.077492 4845 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-cell0-4400-account-create-f658t" Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.081757 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.091120 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4400-account-create-f658t"] Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.135938 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29bfx\" (UniqueName: \"kubernetes.io/projected/4f02af01-d26c-400e-9518-a458cb5db6b9-kube-api-access-29bfx\") pod \"nova-api-88b1-account-create-f9fm7\" (UID: \"4f02af01-d26c-400e-9518-a458cb5db6b9\") " pod="openstack/nova-api-88b1-account-create-f9fm7" Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.153491 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29bfx\" (UniqueName: \"kubernetes.io/projected/4f02af01-d26c-400e-9518-a458cb5db6b9-kube-api-access-29bfx\") pod \"nova-api-88b1-account-create-f9fm7\" (UID: \"4f02af01-d26c-400e-9518-a458cb5db6b9\") " pod="openstack/nova-api-88b1-account-create-f9fm7" Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.167053 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6711858f-8c8e-49bf-afb8-836bcd2cb545","Type":"ContainerStarted","Data":"ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7"} Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.168055 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.196669 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-88b1-account-create-f9fm7" Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.242395 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpg8f\" (UniqueName: \"kubernetes.io/projected/94646571-314c-47a7-a651-e919dc640636-kube-api-access-cpg8f\") pod \"nova-cell0-4400-account-create-f658t\" (UID: \"94646571-314c-47a7-a651-e919dc640636\") " pod="openstack/nova-cell0-4400-account-create-f658t" Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.344558 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpg8f\" (UniqueName: \"kubernetes.io/projected/94646571-314c-47a7-a651-e919dc640636-kube-api-access-cpg8f\") pod \"nova-cell0-4400-account-create-f658t\" (UID: \"94646571-314c-47a7-a651-e919dc640636\") " pod="openstack/nova-cell0-4400-account-create-f658t" Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.364077 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpg8f\" (UniqueName: \"kubernetes.io/projected/94646571-314c-47a7-a651-e919dc640636-kube-api-access-cpg8f\") pod \"nova-cell0-4400-account-create-f658t\" (UID: \"94646571-314c-47a7-a651-e919dc640636\") " pod="openstack/nova-cell0-4400-account-create-f658t" Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.406793 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4400-account-create-f658t" Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.636922 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.980609717 podStartE2EDuration="4.636897316s" podCreationTimestamp="2025-10-06 07:02:19 +0000 UTC" firstStartedPulling="2025-10-06 07:02:19.955807236 +0000 UTC m=+1024.470548244" lastFinishedPulling="2025-10-06 07:02:22.612094835 +0000 UTC m=+1027.126835843" observedRunningTime="2025-10-06 07:02:23.196393749 +0000 UTC m=+1027.711134767" watchObservedRunningTime="2025-10-06 07:02:23.636897316 +0000 UTC m=+1028.151638324" Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.645905 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-88b1-account-create-f9fm7"] Oct 06 07:02:23 crc kubenswrapper[4845]: W1006 07:02:23.646592 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f02af01_d26c_400e_9518_a458cb5db6b9.slice/crio-b61d642aedbed438f6930077a47bbd54203744eee85551b7b0d7f03ba642fd0c WatchSource:0}: Error finding container b61d642aedbed438f6930077a47bbd54203744eee85551b7b0d7f03ba642fd0c: Status 404 returned error can't find the container with id b61d642aedbed438f6930077a47bbd54203744eee85551b7b0d7f03ba642fd0c Oct 06 07:02:23 crc kubenswrapper[4845]: I1006 07:02:23.864111 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4400-account-create-f658t"] Oct 06 07:02:23 crc kubenswrapper[4845]: W1006 07:02:23.965784 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94646571_314c_47a7_a651_e919dc640636.slice/crio-ba904d07aecf9f60b815f65e0e2348d9671faa5fe6b3a931b031d118e1c220d0 WatchSource:0}: Error finding container ba904d07aecf9f60b815f65e0e2348d9671faa5fe6b3a931b031d118e1c220d0: Status 404 
returned error can't find the container with id ba904d07aecf9f60b815f65e0e2348d9671faa5fe6b3a931b031d118e1c220d0 Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.186901 4845 generic.go:334] "Generic (PLEG): container finished" podID="f407712f-9d26-4102-a0c5-e7bfc61800ec" containerID="42e40e0302e0b1b70505c7633e8c3883db0e48f8173b554e87e6c706ef6d3040" exitCode=143 Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.186966 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f407712f-9d26-4102-a0c5-e7bfc61800ec","Type":"ContainerDied","Data":"42e40e0302e0b1b70505c7633e8c3883db0e48f8173b554e87e6c706ef6d3040"} Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.189897 4845 generic.go:334] "Generic (PLEG): container finished" podID="4f02af01-d26c-400e-9518-a458cb5db6b9" containerID="7df2631063b929129bf29e4ef03ad22b72fd099d9aa469685ceb3afc67869674" exitCode=0 Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.189948 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-88b1-account-create-f9fm7" event={"ID":"4f02af01-d26c-400e-9518-a458cb5db6b9","Type":"ContainerDied","Data":"7df2631063b929129bf29e4ef03ad22b72fd099d9aa469685ceb3afc67869674"} Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.189978 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-88b1-account-create-f9fm7" event={"ID":"4f02af01-d26c-400e-9518-a458cb5db6b9","Type":"ContainerStarted","Data":"b61d642aedbed438f6930077a47bbd54203744eee85551b7b0d7f03ba642fd0c"} Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.192138 4845 generic.go:334] "Generic (PLEG): container finished" podID="6cd77935-7853-4e79-becd-771dfdb9cd4f" containerID="b3e1de9452bb05a34a9b45fa770610e273a0586ec4a41b4aa51c96304a7f14bb" exitCode=0 Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.192212 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"6cd77935-7853-4e79-becd-771dfdb9cd4f","Type":"ContainerDied","Data":"b3e1de9452bb05a34a9b45fa770610e273a0586ec4a41b4aa51c96304a7f14bb"} Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.197348 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4400-account-create-f658t" event={"ID":"94646571-314c-47a7-a651-e919dc640636","Type":"ContainerStarted","Data":"5be17671e0451f2856f998311419d1377daba3d645ddaa74ae2e996455df601d"} Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.197432 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4400-account-create-f658t" event={"ID":"94646571-314c-47a7-a651-e919dc640636","Type":"ContainerStarted","Data":"ba904d07aecf9f60b815f65e0e2348d9671faa5fe6b3a931b031d118e1c220d0"} Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.237081 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-4400-account-create-f658t" podStartSLOduration=1.237042607 podStartE2EDuration="1.237042607s" podCreationTimestamp="2025-10-06 07:02:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:02:24.235559349 +0000 UTC m=+1028.750300377" watchObservedRunningTime="2025-10-06 07:02:24.237042607 +0000 UTC m=+1028.751783625" Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.379616 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.474438 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgr7j\" (UniqueName: \"kubernetes.io/projected/6cd77935-7853-4e79-becd-771dfdb9cd4f-kube-api-access-pgr7j\") pod \"6cd77935-7853-4e79-becd-771dfdb9cd4f\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.474946 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-scripts\") pod \"6cd77935-7853-4e79-becd-771dfdb9cd4f\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.475010 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-combined-ca-bundle\") pod \"6cd77935-7853-4e79-becd-771dfdb9cd4f\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.475174 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cd77935-7853-4e79-becd-771dfdb9cd4f-etc-machine-id\") pod \"6cd77935-7853-4e79-becd-771dfdb9cd4f\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.475200 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-config-data-custom\") pod \"6cd77935-7853-4e79-becd-771dfdb9cd4f\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.475236 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-config-data\") pod \"6cd77935-7853-4e79-becd-771dfdb9cd4f\" (UID: \"6cd77935-7853-4e79-becd-771dfdb9cd4f\") " Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.475644 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6cd77935-7853-4e79-becd-771dfdb9cd4f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6cd77935-7853-4e79-becd-771dfdb9cd4f" (UID: "6cd77935-7853-4e79-becd-771dfdb9cd4f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.481096 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6cd77935-7853-4e79-becd-771dfdb9cd4f" (UID: "6cd77935-7853-4e79-becd-771dfdb9cd4f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.481533 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cd77935-7853-4e79-becd-771dfdb9cd4f-kube-api-access-pgr7j" (OuterVolumeSpecName: "kube-api-access-pgr7j") pod "6cd77935-7853-4e79-becd-771dfdb9cd4f" (UID: "6cd77935-7853-4e79-becd-771dfdb9cd4f"). InnerVolumeSpecName "kube-api-access-pgr7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.481946 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-scripts" (OuterVolumeSpecName: "scripts") pod "6cd77935-7853-4e79-becd-771dfdb9cd4f" (UID: "6cd77935-7853-4e79-becd-771dfdb9cd4f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.548715 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cd77935-7853-4e79-becd-771dfdb9cd4f" (UID: "6cd77935-7853-4e79-becd-771dfdb9cd4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.577428 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgr7j\" (UniqueName: \"kubernetes.io/projected/6cd77935-7853-4e79-becd-771dfdb9cd4f-kube-api-access-pgr7j\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.577638 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.577715 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.577798 4845 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cd77935-7853-4e79-becd-771dfdb9cd4f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.577869 4845 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.591527 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-config-data" (OuterVolumeSpecName: "config-data") pod "6cd77935-7853-4e79-becd-771dfdb9cd4f" (UID: "6cd77935-7853-4e79-becd-771dfdb9cd4f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:24 crc kubenswrapper[4845]: I1006 07:02:24.680033 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd77935-7853-4e79-becd-771dfdb9cd4f-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.011680 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.228592 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6cd77935-7853-4e79-becd-771dfdb9cd4f","Type":"ContainerDied","Data":"327c050635a1e6912f0f8d98b3e339d9d467d0a35e1a19b8ba543640d85532f5"} Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.228662 4845 scope.go:117] "RemoveContainer" containerID="8088f5dfd5e83623d6837fc3ac0103337a8873d01801808a21111cb9036ff499" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.228816 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.242717 4845 generic.go:334] "Generic (PLEG): container finished" podID="94646571-314c-47a7-a651-e919dc640636" containerID="5be17671e0451f2856f998311419d1377daba3d645ddaa74ae2e996455df601d" exitCode=0 Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.242797 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4400-account-create-f658t" event={"ID":"94646571-314c-47a7-a651-e919dc640636","Type":"ContainerDied","Data":"5be17671e0451f2856f998311419d1377daba3d645ddaa74ae2e996455df601d"} Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.302733 4845 scope.go:117] "RemoveContainer" containerID="b3e1de9452bb05a34a9b45fa770610e273a0586ec4a41b4aa51c96304a7f14bb" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.341639 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.357116 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.380961 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 07:02:25 crc kubenswrapper[4845]: E1006 07:02:25.381580 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd77935-7853-4e79-becd-771dfdb9cd4f" containerName="probe" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.381597 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd77935-7853-4e79-becd-771dfdb9cd4f" containerName="probe" Oct 06 07:02:25 crc kubenswrapper[4845]: E1006 07:02:25.381608 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd77935-7853-4e79-becd-771dfdb9cd4f" containerName="cinder-scheduler" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.381614 4845 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6cd77935-7853-4e79-becd-771dfdb9cd4f" containerName="cinder-scheduler" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.381915 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd77935-7853-4e79-becd-771dfdb9cd4f" containerName="cinder-scheduler" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.381951 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd77935-7853-4e79-becd-771dfdb9cd4f" containerName="probe" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.383365 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.389330 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.417322 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.499134 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9789203-7142-4ed7-b8db-7105d5233557-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d9789203-7142-4ed7-b8db-7105d5233557\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.499205 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9789203-7142-4ed7-b8db-7105d5233557-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d9789203-7142-4ed7-b8db-7105d5233557\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.499226 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzmkz\" (UniqueName: 
\"kubernetes.io/projected/d9789203-7142-4ed7-b8db-7105d5233557-kube-api-access-xzmkz\") pod \"cinder-scheduler-0\" (UID: \"d9789203-7142-4ed7-b8db-7105d5233557\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.499764 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9789203-7142-4ed7-b8db-7105d5233557-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d9789203-7142-4ed7-b8db-7105d5233557\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.499893 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9789203-7142-4ed7-b8db-7105d5233557-config-data\") pod \"cinder-scheduler-0\" (UID: \"d9789203-7142-4ed7-b8db-7105d5233557\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.499949 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9789203-7142-4ed7-b8db-7105d5233557-scripts\") pod \"cinder-scheduler-0\" (UID: \"d9789203-7142-4ed7-b8db-7105d5233557\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.602248 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9789203-7142-4ed7-b8db-7105d5233557-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d9789203-7142-4ed7-b8db-7105d5233557\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.602331 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzmkz\" (UniqueName: \"kubernetes.io/projected/d9789203-7142-4ed7-b8db-7105d5233557-kube-api-access-xzmkz\") pod 
\"cinder-scheduler-0\" (UID: \"d9789203-7142-4ed7-b8db-7105d5233557\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.602354 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9789203-7142-4ed7-b8db-7105d5233557-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d9789203-7142-4ed7-b8db-7105d5233557\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.602413 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9789203-7142-4ed7-b8db-7105d5233557-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d9789203-7142-4ed7-b8db-7105d5233557\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.602453 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9789203-7142-4ed7-b8db-7105d5233557-config-data\") pod \"cinder-scheduler-0\" (UID: \"d9789203-7142-4ed7-b8db-7105d5233557\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.602478 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9789203-7142-4ed7-b8db-7105d5233557-scripts\") pod \"cinder-scheduler-0\" (UID: \"d9789203-7142-4ed7-b8db-7105d5233557\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.602762 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9789203-7142-4ed7-b8db-7105d5233557-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d9789203-7142-4ed7-b8db-7105d5233557\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.607281 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9789203-7142-4ed7-b8db-7105d5233557-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d9789203-7142-4ed7-b8db-7105d5233557\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.609082 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9789203-7142-4ed7-b8db-7105d5233557-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d9789203-7142-4ed7-b8db-7105d5233557\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.611019 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9789203-7142-4ed7-b8db-7105d5233557-config-data\") pod \"cinder-scheduler-0\" (UID: \"d9789203-7142-4ed7-b8db-7105d5233557\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.611497 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9789203-7142-4ed7-b8db-7105d5233557-scripts\") pod \"cinder-scheduler-0\" (UID: \"d9789203-7142-4ed7-b8db-7105d5233557\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.624851 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzmkz\" (UniqueName: \"kubernetes.io/projected/d9789203-7142-4ed7-b8db-7105d5233557-kube-api-access-xzmkz\") pod \"cinder-scheduler-0\" (UID: \"d9789203-7142-4ed7-b8db-7105d5233557\") " pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.701268 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-88b1-account-create-f9fm7" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.725185 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.771774 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.772636 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="74662789-933b-40b7-8e5b-2a3e32ba08f4" containerName="glance-httpd" containerID="cri-o://e01e6a18db4f10dbbf8fa6de9f9b4a5f67a783a8541148211911eefe4e6dbe7d" gracePeriod=30 Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.772776 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="74662789-933b-40b7-8e5b-2a3e32ba08f4" containerName="glance-log" containerID="cri-o://a3548ec736a947129ca3d614632af2fa0d86e4351c4a16563b2891f660f68b28" gracePeriod=30 Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.806455 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29bfx\" (UniqueName: \"kubernetes.io/projected/4f02af01-d26c-400e-9518-a458cb5db6b9-kube-api-access-29bfx\") pod \"4f02af01-d26c-400e-9518-a458cb5db6b9\" (UID: \"4f02af01-d26c-400e-9518-a458cb5db6b9\") " Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.816509 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f02af01-d26c-400e-9518-a458cb5db6b9-kube-api-access-29bfx" (OuterVolumeSpecName: "kube-api-access-29bfx") pod "4f02af01-d26c-400e-9518-a458cb5db6b9" (UID: "4f02af01-d26c-400e-9518-a458cb5db6b9"). InnerVolumeSpecName "kube-api-access-29bfx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:02:25 crc kubenswrapper[4845]: I1006 07:02:25.909448 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29bfx\" (UniqueName: \"kubernetes.io/projected/4f02af01-d26c-400e-9518-a458cb5db6b9-kube-api-access-29bfx\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.251483 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cd77935-7853-4e79-becd-771dfdb9cd4f" path="/var/lib/kubelet/pods/6cd77935-7853-4e79-becd-771dfdb9cd4f/volumes" Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.258984 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.277050 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-88b1-account-create-f9fm7" event={"ID":"4f02af01-d26c-400e-9518-a458cb5db6b9","Type":"ContainerDied","Data":"b61d642aedbed438f6930077a47bbd54203744eee85551b7b0d7f03ba642fd0c"} Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.277087 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b61d642aedbed438f6930077a47bbd54203744eee85551b7b0d7f03ba642fd0c" Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.277190 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-88b1-account-create-f9fm7" Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.295373 4845 generic.go:334] "Generic (PLEG): container finished" podID="74662789-933b-40b7-8e5b-2a3e32ba08f4" containerID="a3548ec736a947129ca3d614632af2fa0d86e4351c4a16563b2891f660f68b28" exitCode=143 Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.295548 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74662789-933b-40b7-8e5b-2a3e32ba08f4","Type":"ContainerDied","Data":"a3548ec736a947129ca3d614632af2fa0d86e4351c4a16563b2891f660f68b28"} Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.295737 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerName="ceilometer-central-agent" containerID="cri-o://f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8" gracePeriod=30 Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.296688 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerName="proxy-httpd" containerID="cri-o://ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7" gracePeriod=30 Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.296770 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerName="sg-core" containerID="cri-o://6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8" gracePeriod=30 Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.296803 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerName="ceilometer-notification-agent" 
containerID="cri-o://737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a" gracePeriod=30 Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.789998 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4400-account-create-f658t" Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.842739 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.934614 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-config-data\") pod \"f407712f-9d26-4102-a0c5-e7bfc61800ec\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.934680 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f407712f-9d26-4102-a0c5-e7bfc61800ec-httpd-run\") pod \"f407712f-9d26-4102-a0c5-e7bfc61800ec\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.934729 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"f407712f-9d26-4102-a0c5-e7bfc61800ec\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.935639 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f407712f-9d26-4102-a0c5-e7bfc61800ec-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f407712f-9d26-4102-a0c5-e7bfc61800ec" (UID: "f407712f-9d26-4102-a0c5-e7bfc61800ec"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.936472 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpg8f\" (UniqueName: \"kubernetes.io/projected/94646571-314c-47a7-a651-e919dc640636-kube-api-access-cpg8f\") pod \"94646571-314c-47a7-a651-e919dc640636\" (UID: \"94646571-314c-47a7-a651-e919dc640636\") " Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.936517 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-scripts\") pod \"f407712f-9d26-4102-a0c5-e7bfc61800ec\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.936550 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8smt\" (UniqueName: \"kubernetes.io/projected/f407712f-9d26-4102-a0c5-e7bfc61800ec-kube-api-access-h8smt\") pod \"f407712f-9d26-4102-a0c5-e7bfc61800ec\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.936577 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-combined-ca-bundle\") pod \"f407712f-9d26-4102-a0c5-e7bfc61800ec\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.936613 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f407712f-9d26-4102-a0c5-e7bfc61800ec-logs\") pod \"f407712f-9d26-4102-a0c5-e7bfc61800ec\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.936639 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-public-tls-certs\") pod \"f407712f-9d26-4102-a0c5-e7bfc61800ec\" (UID: \"f407712f-9d26-4102-a0c5-e7bfc61800ec\") " Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.937777 4845 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f407712f-9d26-4102-a0c5-e7bfc61800ec-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.938211 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f407712f-9d26-4102-a0c5-e7bfc61800ec-logs" (OuterVolumeSpecName: "logs") pod "f407712f-9d26-4102-a0c5-e7bfc61800ec" (UID: "f407712f-9d26-4102-a0c5-e7bfc61800ec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.943117 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-scripts" (OuterVolumeSpecName: "scripts") pod "f407712f-9d26-4102-a0c5-e7bfc61800ec" (UID: "f407712f-9d26-4102-a0c5-e7bfc61800ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.943480 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f407712f-9d26-4102-a0c5-e7bfc61800ec-kube-api-access-h8smt" (OuterVolumeSpecName: "kube-api-access-h8smt") pod "f407712f-9d26-4102-a0c5-e7bfc61800ec" (UID: "f407712f-9d26-4102-a0c5-e7bfc61800ec"). InnerVolumeSpecName "kube-api-access-h8smt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.944619 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94646571-314c-47a7-a651-e919dc640636-kube-api-access-cpg8f" (OuterVolumeSpecName: "kube-api-access-cpg8f") pod "94646571-314c-47a7-a651-e919dc640636" (UID: "94646571-314c-47a7-a651-e919dc640636"). InnerVolumeSpecName "kube-api-access-cpg8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.947668 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "f407712f-9d26-4102-a0c5-e7bfc61800ec" (UID: "f407712f-9d26-4102-a0c5-e7bfc61800ec"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 07:02:26 crc kubenswrapper[4845]: I1006 07:02:26.968906 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f407712f-9d26-4102-a0c5-e7bfc61800ec" (UID: "f407712f-9d26-4102-a0c5-e7bfc61800ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.039312 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f407712f-9d26-4102-a0c5-e7bfc61800ec" (UID: "f407712f-9d26-4102-a0c5-e7bfc61800ec"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.039688 4845 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.039759 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpg8f\" (UniqueName: \"kubernetes.io/projected/94646571-314c-47a7-a651-e919dc640636-kube-api-access-cpg8f\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.039819 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.039893 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8smt\" (UniqueName: \"kubernetes.io/projected/f407712f-9d26-4102-a0c5-e7bfc61800ec-kube-api-access-h8smt\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.039947 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.040000 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f407712f-9d26-4102-a0c5-e7bfc61800ec-logs\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.040055 4845 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.049871 4845 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-config-data" (OuterVolumeSpecName: "config-data") pod "f407712f-9d26-4102-a0c5-e7bfc61800ec" (UID: "f407712f-9d26-4102-a0c5-e7bfc61800ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.073001 4845 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.142202 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f407712f-9d26-4102-a0c5-e7bfc61800ec-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.142230 4845 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.143674 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.243757 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6711858f-8c8e-49bf-afb8-836bcd2cb545-log-httpd\") pod \"6711858f-8c8e-49bf-afb8-836bcd2cb545\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.243809 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xjm\" (UniqueName: \"kubernetes.io/projected/6711858f-8c8e-49bf-afb8-836bcd2cb545-kube-api-access-w4xjm\") pod \"6711858f-8c8e-49bf-afb8-836bcd2cb545\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.243870 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-sg-core-conf-yaml\") pod \"6711858f-8c8e-49bf-afb8-836bcd2cb545\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.243896 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6711858f-8c8e-49bf-afb8-836bcd2cb545-run-httpd\") pod \"6711858f-8c8e-49bf-afb8-836bcd2cb545\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.243979 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-config-data\") pod \"6711858f-8c8e-49bf-afb8-836bcd2cb545\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.244037 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-scripts\") pod \"6711858f-8c8e-49bf-afb8-836bcd2cb545\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.244109 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-combined-ca-bundle\") pod \"6711858f-8c8e-49bf-afb8-836bcd2cb545\" (UID: \"6711858f-8c8e-49bf-afb8-836bcd2cb545\") " Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.244194 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6711858f-8c8e-49bf-afb8-836bcd2cb545-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6711858f-8c8e-49bf-afb8-836bcd2cb545" (UID: "6711858f-8c8e-49bf-afb8-836bcd2cb545"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.244498 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6711858f-8c8e-49bf-afb8-836bcd2cb545-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6711858f-8c8e-49bf-afb8-836bcd2cb545" (UID: "6711858f-8c8e-49bf-afb8-836bcd2cb545"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.244648 4845 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6711858f-8c8e-49bf-afb8-836bcd2cb545-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.244662 4845 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6711858f-8c8e-49bf-afb8-836bcd2cb545-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.250556 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6711858f-8c8e-49bf-afb8-836bcd2cb545-kube-api-access-w4xjm" (OuterVolumeSpecName: "kube-api-access-w4xjm") pod "6711858f-8c8e-49bf-afb8-836bcd2cb545" (UID: "6711858f-8c8e-49bf-afb8-836bcd2cb545"). InnerVolumeSpecName "kube-api-access-w4xjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.251466 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-scripts" (OuterVolumeSpecName: "scripts") pod "6711858f-8c8e-49bf-afb8-836bcd2cb545" (UID: "6711858f-8c8e-49bf-afb8-836bcd2cb545"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.314461 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d9789203-7142-4ed7-b8db-7105d5233557","Type":"ContainerStarted","Data":"d9755ee2b19a89a919cf6092f8261594541a4b1b54bea4c97123f5bca3c72738"} Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.314514 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d9789203-7142-4ed7-b8db-7105d5233557","Type":"ContainerStarted","Data":"2bce90e0879e2edc213af2c92f5ac5b895b4d250c8b45074247fbb514aaf7286"} Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.319779 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4400-account-create-f658t" event={"ID":"94646571-314c-47a7-a651-e919dc640636","Type":"ContainerDied","Data":"ba904d07aecf9f60b815f65e0e2348d9671faa5fe6b3a931b031d118e1c220d0"} Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.319828 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba904d07aecf9f60b815f65e0e2348d9671faa5fe6b3a931b031d118e1c220d0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.319902 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4400-account-create-f658t" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.320359 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6711858f-8c8e-49bf-afb8-836bcd2cb545" (UID: "6711858f-8c8e-49bf-afb8-836bcd2cb545"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.332312 4845 generic.go:334] "Generic (PLEG): container finished" podID="f407712f-9d26-4102-a0c5-e7bfc61800ec" containerID="1ce44fd2833ce3a9bf718f3299f27f46514d703a42fd910a8f6d20006f404710" exitCode=0 Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.332443 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f407712f-9d26-4102-a0c5-e7bfc61800ec","Type":"ContainerDied","Data":"1ce44fd2833ce3a9bf718f3299f27f46514d703a42fd910a8f6d20006f404710"} Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.332474 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f407712f-9d26-4102-a0c5-e7bfc61800ec","Type":"ContainerDied","Data":"5394291eb190a1e54fef3c4d3ef8efe0723abeb74dbc8b36eab5a343959aedfc"} Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.332493 4845 scope.go:117] "RemoveContainer" containerID="1ce44fd2833ce3a9bf718f3299f27f46514d703a42fd910a8f6d20006f404710" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.332620 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.344537 4845 generic.go:334] "Generic (PLEG): container finished" podID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerID="ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7" exitCode=0 Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.344582 4845 generic.go:334] "Generic (PLEG): container finished" podID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerID="6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8" exitCode=2 Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.344591 4845 generic.go:334] "Generic (PLEG): container finished" podID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerID="737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a" exitCode=0 Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.344600 4845 generic.go:334] "Generic (PLEG): container finished" podID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerID="f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8" exitCode=0 Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.344621 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6711858f-8c8e-49bf-afb8-836bcd2cb545","Type":"ContainerDied","Data":"ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7"} Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.344649 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6711858f-8c8e-49bf-afb8-836bcd2cb545","Type":"ContainerDied","Data":"6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8"} Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.344659 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6711858f-8c8e-49bf-afb8-836bcd2cb545","Type":"ContainerDied","Data":"737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a"} Oct 06 07:02:27 
crc kubenswrapper[4845]: I1006 07:02:27.344669 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6711858f-8c8e-49bf-afb8-836bcd2cb545","Type":"ContainerDied","Data":"f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8"} Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.344679 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6711858f-8c8e-49bf-afb8-836bcd2cb545","Type":"ContainerDied","Data":"1ef8c4f48bc771c02e72f5b6a49788a8db7fce705a082d91109ce7c502109bb8"} Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.344777 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.346808 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xjm\" (UniqueName: \"kubernetes.io/projected/6711858f-8c8e-49bf-afb8-836bcd2cb545-kube-api-access-w4xjm\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.346833 4845 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.346843 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.377254 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-config-data" (OuterVolumeSpecName: "config-data") pod "6711858f-8c8e-49bf-afb8-836bcd2cb545" (UID: "6711858f-8c8e-49bf-afb8-836bcd2cb545"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.398904 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6711858f-8c8e-49bf-afb8-836bcd2cb545" (UID: "6711858f-8c8e-49bf-afb8-836bcd2cb545"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.449781 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.449836 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6711858f-8c8e-49bf-afb8-836bcd2cb545-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.499008 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.506357 4845 scope.go:117] "RemoveContainer" containerID="42e40e0302e0b1b70505c7633e8c3883db0e48f8173b554e87e6c706ef6d3040" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.510443 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.522018 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 07:02:27 crc kubenswrapper[4845]: E1006 07:02:27.522404 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f407712f-9d26-4102-a0c5-e7bfc61800ec" containerName="glance-log" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.522419 4845 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f407712f-9d26-4102-a0c5-e7bfc61800ec" containerName="glance-log" Oct 06 07:02:27 crc kubenswrapper[4845]: E1006 07:02:27.522444 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerName="ceilometer-central-agent" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.522452 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerName="ceilometer-central-agent" Oct 06 07:02:27 crc kubenswrapper[4845]: E1006 07:02:27.522461 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f02af01-d26c-400e-9518-a458cb5db6b9" containerName="mariadb-account-create" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.522467 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f02af01-d26c-400e-9518-a458cb5db6b9" containerName="mariadb-account-create" Oct 06 07:02:27 crc kubenswrapper[4845]: E1006 07:02:27.522477 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94646571-314c-47a7-a651-e919dc640636" containerName="mariadb-account-create" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.522483 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="94646571-314c-47a7-a651-e919dc640636" containerName="mariadb-account-create" Oct 06 07:02:27 crc kubenswrapper[4845]: E1006 07:02:27.522497 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerName="sg-core" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.522503 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerName="sg-core" Oct 06 07:02:27 crc kubenswrapper[4845]: E1006 07:02:27.522520 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerName="ceilometer-notification-agent" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.522527 4845 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerName="ceilometer-notification-agent" Oct 06 07:02:27 crc kubenswrapper[4845]: E1006 07:02:27.522540 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerName="proxy-httpd" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.522547 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerName="proxy-httpd" Oct 06 07:02:27 crc kubenswrapper[4845]: E1006 07:02:27.522555 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f407712f-9d26-4102-a0c5-e7bfc61800ec" containerName="glance-httpd" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.522561 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f407712f-9d26-4102-a0c5-e7bfc61800ec" containerName="glance-httpd" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.522722 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerName="ceilometer-notification-agent" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.522738 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="f407712f-9d26-4102-a0c5-e7bfc61800ec" containerName="glance-log" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.522752 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerName="proxy-httpd" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.522759 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="f407712f-9d26-4102-a0c5-e7bfc61800ec" containerName="glance-httpd" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.522774 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f02af01-d26c-400e-9518-a458cb5db6b9" containerName="mariadb-account-create" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 
07:02:27.522781 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerName="ceilometer-central-agent" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.522791 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="6711858f-8c8e-49bf-afb8-836bcd2cb545" containerName="sg-core" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.522798 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="94646571-314c-47a7-a651-e919dc640636" containerName="mariadb-account-create" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.523784 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.526052 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.529787 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.530769 4845 scope.go:117] "RemoveContainer" containerID="1ce44fd2833ce3a9bf718f3299f27f46514d703a42fd910a8f6d20006f404710" Oct 06 07:02:27 crc kubenswrapper[4845]: E1006 07:02:27.531412 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ce44fd2833ce3a9bf718f3299f27f46514d703a42fd910a8f6d20006f404710\": container with ID starting with 1ce44fd2833ce3a9bf718f3299f27f46514d703a42fd910a8f6d20006f404710 not found: ID does not exist" containerID="1ce44fd2833ce3a9bf718f3299f27f46514d703a42fd910a8f6d20006f404710" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.531450 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ce44fd2833ce3a9bf718f3299f27f46514d703a42fd910a8f6d20006f404710"} err="failed to 
get container status \"1ce44fd2833ce3a9bf718f3299f27f46514d703a42fd910a8f6d20006f404710\": rpc error: code = NotFound desc = could not find container \"1ce44fd2833ce3a9bf718f3299f27f46514d703a42fd910a8f6d20006f404710\": container with ID starting with 1ce44fd2833ce3a9bf718f3299f27f46514d703a42fd910a8f6d20006f404710 not found: ID does not exist" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.531477 4845 scope.go:117] "RemoveContainer" containerID="42e40e0302e0b1b70505c7633e8c3883db0e48f8173b554e87e6c706ef6d3040" Oct 06 07:02:27 crc kubenswrapper[4845]: E1006 07:02:27.531766 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42e40e0302e0b1b70505c7633e8c3883db0e48f8173b554e87e6c706ef6d3040\": container with ID starting with 42e40e0302e0b1b70505c7633e8c3883db0e48f8173b554e87e6c706ef6d3040 not found: ID does not exist" containerID="42e40e0302e0b1b70505c7633e8c3883db0e48f8173b554e87e6c706ef6d3040" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.531796 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42e40e0302e0b1b70505c7633e8c3883db0e48f8173b554e87e6c706ef6d3040"} err="failed to get container status \"42e40e0302e0b1b70505c7633e8c3883db0e48f8173b554e87e6c706ef6d3040\": rpc error: code = NotFound desc = could not find container \"42e40e0302e0b1b70505c7633e8c3883db0e48f8173b554e87e6c706ef6d3040\": container with ID starting with 42e40e0302e0b1b70505c7633e8c3883db0e48f8173b554e87e6c706ef6d3040 not found: ID does not exist" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.531818 4845 scope.go:117] "RemoveContainer" containerID="ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.533134 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.569638 4845 
scope.go:117] "RemoveContainer" containerID="6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.600105 4845 scope.go:117] "RemoveContainer" containerID="737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.627281 4845 scope.go:117] "RemoveContainer" containerID="f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.660220 4845 scope.go:117] "RemoveContainer" containerID="ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7" Oct 06 07:02:27 crc kubenswrapper[4845]: E1006 07:02:27.663848 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7\": container with ID starting with ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7 not found: ID does not exist" containerID="ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.663908 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7"} err="failed to get container status \"ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7\": rpc error: code = NotFound desc = could not find container \"ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7\": container with ID starting with ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7 not found: ID does not exist" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.663941 4845 scope.go:117] "RemoveContainer" containerID="6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8" Oct 06 07:02:27 crc kubenswrapper[4845]: E1006 07:02:27.664339 4845 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8\": container with ID starting with 6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8 not found: ID does not exist" containerID="6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.666006 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8"} err="failed to get container status \"6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8\": rpc error: code = NotFound desc = could not find container \"6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8\": container with ID starting with 6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8 not found: ID does not exist" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.666048 4845 scope.go:117] "RemoveContainer" containerID="737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a" Oct 06 07:02:27 crc kubenswrapper[4845]: E1006 07:02:27.667904 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a\": container with ID starting with 737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a not found: ID does not exist" containerID="737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.667958 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a"} err="failed to get container status \"737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a\": rpc error: code = NotFound desc = could not find container 
\"737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a\": container with ID starting with 737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a not found: ID does not exist" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.667988 4845 scope.go:117] "RemoveContainer" containerID="f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8" Oct 06 07:02:27 crc kubenswrapper[4845]: E1006 07:02:27.671690 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8\": container with ID starting with f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8 not found: ID does not exist" containerID="f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.671716 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8"} err="failed to get container status \"f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8\": rpc error: code = NotFound desc = could not find container \"f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8\": container with ID starting with f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8 not found: ID does not exist" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.671758 4845 scope.go:117] "RemoveContainer" containerID="ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.675767 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7"} err="failed to get container status \"ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7\": rpc error: code = NotFound desc = could not find 
container \"ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7\": container with ID starting with ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7 not found: ID does not exist" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.676538 4845 scope.go:117] "RemoveContainer" containerID="6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.679162 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8"} err="failed to get container status \"6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8\": rpc error: code = NotFound desc = could not find container \"6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8\": container with ID starting with 6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8 not found: ID does not exist" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.679227 4845 scope.go:117] "RemoveContainer" containerID="737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.680158 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a"} err="failed to get container status \"737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a\": rpc error: code = NotFound desc = could not find container \"737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a\": container with ID starting with 737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a not found: ID does not exist" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.680187 4845 scope.go:117] "RemoveContainer" containerID="f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.682318 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e1706a-220a-4291-b2b3-1b79660ec95b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.682396 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07e1706a-220a-4291-b2b3-1b79660ec95b-config-data\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.682529 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07e1706a-220a-4291-b2b3-1b79660ec95b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.682552 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07e1706a-220a-4291-b2b3-1b79660ec95b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.682691 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07e1706a-220a-4291-b2b3-1b79660ec95b-logs\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 
07:02:27.682745 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.682785 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07e1706a-220a-4291-b2b3-1b79660ec95b-scripts\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.682843 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gspk5\" (UniqueName: \"kubernetes.io/projected/07e1706a-220a-4291-b2b3-1b79660ec95b-kube-api-access-gspk5\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.683708 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8"} err="failed to get container status \"f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8\": rpc error: code = NotFound desc = could not find container \"f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8\": container with ID starting with f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8 not found: ID does not exist" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.683767 4845 scope.go:117] "RemoveContainer" containerID="ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.684256 4845 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7"} err="failed to get container status \"ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7\": rpc error: code = NotFound desc = could not find container \"ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7\": container with ID starting with ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7 not found: ID does not exist" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.684303 4845 scope.go:117] "RemoveContainer" containerID="6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.685260 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8"} err="failed to get container status \"6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8\": rpc error: code = NotFound desc = could not find container \"6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8\": container with ID starting with 6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8 not found: ID does not exist" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.685330 4845 scope.go:117] "RemoveContainer" containerID="737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.686167 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a"} err="failed to get container status \"737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a\": rpc error: code = NotFound desc = could not find container \"737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a\": container with ID starting with 
737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a not found: ID does not exist" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.686209 4845 scope.go:117] "RemoveContainer" containerID="f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.687154 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8"} err="failed to get container status \"f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8\": rpc error: code = NotFound desc = could not find container \"f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8\": container with ID starting with f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8 not found: ID does not exist" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.687856 4845 scope.go:117] "RemoveContainer" containerID="ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.688504 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7"} err="failed to get container status \"ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7\": rpc error: code = NotFound desc = could not find container \"ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7\": container with ID starting with ad27c0b71b5f35af6fd3f8aba05c81df40e59af29c66bd043eb31654724e10d7 not found: ID does not exist" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.688545 4845 scope.go:117] "RemoveContainer" containerID="6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.689635 4845 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8"} err="failed to get container status \"6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8\": rpc error: code = NotFound desc = could not find container \"6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8\": container with ID starting with 6e032a9e018813b36639e3cf91c43c012355e7bb77ad8a7b048e101a5894a5b8 not found: ID does not exist" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.693677 4845 scope.go:117] "RemoveContainer" containerID="737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.715191 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a"} err="failed to get container status \"737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a\": rpc error: code = NotFound desc = could not find container \"737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a\": container with ID starting with 737d18546f81dc8497a218d322e711c9b886f82d24b247964d0749630938f43a not found: ID does not exist" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.715489 4845 scope.go:117] "RemoveContainer" containerID="f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.718072 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8"} err="failed to get container status \"f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8\": rpc error: code = NotFound desc = could not find container \"f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8\": container with ID starting with f42bc254ae60ee3788706cc9f65520ad64ca8d23a5d229804a6f387c938df1a8 not found: ID does not 
exist" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.729596 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.736126 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.755664 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.759917 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.761017 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.761977 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.762268 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.784451 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e1706a-220a-4291-b2b3-1b79660ec95b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.784722 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07e1706a-220a-4291-b2b3-1b79660ec95b-config-data\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.784819 4845 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07e1706a-220a-4291-b2b3-1b79660ec95b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.784885 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07e1706a-220a-4291-b2b3-1b79660ec95b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.784970 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07e1706a-220a-4291-b2b3-1b79660ec95b-logs\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.785037 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.785111 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07e1706a-220a-4291-b2b3-1b79660ec95b-scripts\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.785179 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gspk5\" (UniqueName: 
\"kubernetes.io/projected/07e1706a-220a-4291-b2b3-1b79660ec95b-kube-api-access-gspk5\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.787717 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07e1706a-220a-4291-b2b3-1b79660ec95b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.788190 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07e1706a-220a-4291-b2b3-1b79660ec95b-logs\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.787832 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.789522 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e1706a-220a-4291-b2b3-1b79660ec95b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.793006 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/07e1706a-220a-4291-b2b3-1b79660ec95b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.795524 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07e1706a-220a-4291-b2b3-1b79660ec95b-scripts\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.803851 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gspk5\" (UniqueName: \"kubernetes.io/projected/07e1706a-220a-4291-b2b3-1b79660ec95b-kube-api-access-gspk5\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.811182 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07e1706a-220a-4291-b2b3-1b79660ec95b-config-data\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.819167 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"07e1706a-220a-4291-b2b3-1b79660ec95b\") " pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.850788 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.881296 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.887067 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-config-data\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.887130 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.887176 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-log-httpd\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.887202 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-run-httpd\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.887366 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-scripts\") pod \"ceilometer-0\" 
(UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.887435 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.887452 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv4hx\" (UniqueName: \"kubernetes.io/projected/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-kube-api-access-xv4hx\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.991380 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-config-data\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.991759 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.991793 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-log-httpd\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.991823 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-run-httpd\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.991965 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-scripts\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.991995 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.992017 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv4hx\" (UniqueName: \"kubernetes.io/projected/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-kube-api-access-xv4hx\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.993539 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-run-httpd\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.993776 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-log-httpd\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " 
pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.997527 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:27 crc kubenswrapper[4845]: I1006 07:02:27.998119 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.000932 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-scripts\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.001275 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-config-data\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.016821 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv4hx\" (UniqueName: \"kubernetes.io/projected/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-kube-api-access-xv4hx\") pod \"ceilometer-0\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " pod="openstack/ceilometer-0" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.084859 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.244667 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6711858f-8c8e-49bf-afb8-836bcd2cb545" path="/var/lib/kubelet/pods/6711858f-8c8e-49bf-afb8-836bcd2cb545/volumes" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.245857 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f407712f-9d26-4102-a0c5-e7bfc61800ec" path="/var/lib/kubelet/pods/f407712f-9d26-4102-a0c5-e7bfc61800ec/volumes" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.350787 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ct7br"] Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.351992 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ct7br" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.356101 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.356222 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.360739 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-tghcz" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.367313 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ct7br"] Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.380359 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d9789203-7142-4ed7-b8db-7105d5233557","Type":"ContainerStarted","Data":"649f5f153807e4dccd14b817b3389e5d613579712940badd99cdbaecd59e8563"} Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.473023 4845 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.473002533 podStartE2EDuration="3.473002533s" podCreationTimestamp="2025-10-06 07:02:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:02:28.412634411 +0000 UTC m=+1032.927375419" watchObservedRunningTime="2025-10-06 07:02:28.473002533 +0000 UTC m=+1032.987743551" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.482416 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.501693 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f578e1-f6a1-4986-99af-0ab3a17cef8a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ct7br\" (UID: \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\") " pod="openstack/nova-cell0-conductor-db-sync-ct7br" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.501840 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f578e1-f6a1-4986-99af-0ab3a17cef8a-scripts\") pod \"nova-cell0-conductor-db-sync-ct7br\" (UID: \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\") " pod="openstack/nova-cell0-conductor-db-sync-ct7br" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.502753 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2c4d\" (UniqueName: \"kubernetes.io/projected/18f578e1-f6a1-4986-99af-0ab3a17cef8a-kube-api-access-s2c4d\") pod \"nova-cell0-conductor-db-sync-ct7br\" (UID: \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\") " pod="openstack/nova-cell0-conductor-db-sync-ct7br" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.502972 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f578e1-f6a1-4986-99af-0ab3a17cef8a-config-data\") pod \"nova-cell0-conductor-db-sync-ct7br\" (UID: \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\") " pod="openstack/nova-cell0-conductor-db-sync-ct7br" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.606221 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f578e1-f6a1-4986-99af-0ab3a17cef8a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ct7br\" (UID: \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\") " pod="openstack/nova-cell0-conductor-db-sync-ct7br" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.606323 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f578e1-f6a1-4986-99af-0ab3a17cef8a-scripts\") pod \"nova-cell0-conductor-db-sync-ct7br\" (UID: \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\") " pod="openstack/nova-cell0-conductor-db-sync-ct7br" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.606360 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2c4d\" (UniqueName: \"kubernetes.io/projected/18f578e1-f6a1-4986-99af-0ab3a17cef8a-kube-api-access-s2c4d\") pod \"nova-cell0-conductor-db-sync-ct7br\" (UID: \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\") " pod="openstack/nova-cell0-conductor-db-sync-ct7br" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.606459 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f578e1-f6a1-4986-99af-0ab3a17cef8a-config-data\") pod \"nova-cell0-conductor-db-sync-ct7br\" (UID: \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\") " pod="openstack/nova-cell0-conductor-db-sync-ct7br" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 
07:02:28.611996 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f578e1-f6a1-4986-99af-0ab3a17cef8a-scripts\") pod \"nova-cell0-conductor-db-sync-ct7br\" (UID: \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\") " pod="openstack/nova-cell0-conductor-db-sync-ct7br" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.623280 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f578e1-f6a1-4986-99af-0ab3a17cef8a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ct7br\" (UID: \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\") " pod="openstack/nova-cell0-conductor-db-sync-ct7br" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.625917 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f578e1-f6a1-4986-99af-0ab3a17cef8a-config-data\") pod \"nova-cell0-conductor-db-sync-ct7br\" (UID: \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\") " pod="openstack/nova-cell0-conductor-db-sync-ct7br" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.627859 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2c4d\" (UniqueName: \"kubernetes.io/projected/18f578e1-f6a1-4986-99af-0ab3a17cef8a-kube-api-access-s2c4d\") pod \"nova-cell0-conductor-db-sync-ct7br\" (UID: \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\") " pod="openstack/nova-cell0-conductor-db-sync-ct7br" Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.628448 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:02:28 crc kubenswrapper[4845]: I1006 07:02:28.678882 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ct7br" Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.220984 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ct7br"] Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.406803 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"07e1706a-220a-4291-b2b3-1b79660ec95b","Type":"ContainerStarted","Data":"9702a54673be84fc8064b033e926bb35add7351b9b026fcded90c4265e007c32"} Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.406840 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"07e1706a-220a-4291-b2b3-1b79660ec95b","Type":"ContainerStarted","Data":"e5140b76f20743c768cc82f2d5f2128542725f8c76bd0a4844e99bb15944fde6"} Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.434207 4845 generic.go:334] "Generic (PLEG): container finished" podID="74662789-933b-40b7-8e5b-2a3e32ba08f4" containerID="e01e6a18db4f10dbbf8fa6de9f9b4a5f67a783a8541148211911eefe4e6dbe7d" exitCode=0 Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.434573 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74662789-933b-40b7-8e5b-2a3e32ba08f4","Type":"ContainerDied","Data":"e01e6a18db4f10dbbf8fa6de9f9b4a5f67a783a8541148211911eefe4e6dbe7d"} Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.449963 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ct7br" event={"ID":"18f578e1-f6a1-4986-99af-0ab3a17cef8a","Type":"ContainerStarted","Data":"d1d8e0679ac4f7c6cecbfb21878aeaeb544fe92093dbfeb2a1997f9cdd92a0c6"} Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.451885 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"43802b75-eb4d-43b8-ab8c-0b84a7b604e7","Type":"ContainerStarted","Data":"e29ec5199bbb33b06248779665371f082450f10773dd8eeedb3be1bc16f6305c"} Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.451962 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43802b75-eb4d-43b8-ab8c-0b84a7b604e7","Type":"ContainerStarted","Data":"334a0268555e8ef9a267cc3b13212d0c53300c3a69dbc1f5995d18519d893d88"} Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.611148 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.734203 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74662789-933b-40b7-8e5b-2a3e32ba08f4-httpd-run\") pod \"74662789-933b-40b7-8e5b-2a3e32ba08f4\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.734569 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-scripts\") pod \"74662789-933b-40b7-8e5b-2a3e32ba08f4\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.734622 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-config-data\") pod \"74662789-933b-40b7-8e5b-2a3e32ba08f4\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.734650 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74662789-933b-40b7-8e5b-2a3e32ba08f4-logs\") pod \"74662789-933b-40b7-8e5b-2a3e32ba08f4\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") 
" Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.734738 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-combined-ca-bundle\") pod \"74662789-933b-40b7-8e5b-2a3e32ba08f4\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.734776 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-internal-tls-certs\") pod \"74662789-933b-40b7-8e5b-2a3e32ba08f4\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.734859 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7x2h\" (UniqueName: \"kubernetes.io/projected/74662789-933b-40b7-8e5b-2a3e32ba08f4-kube-api-access-t7x2h\") pod \"74662789-933b-40b7-8e5b-2a3e32ba08f4\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.734878 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"74662789-933b-40b7-8e5b-2a3e32ba08f4\" (UID: \"74662789-933b-40b7-8e5b-2a3e32ba08f4\") " Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.735326 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74662789-933b-40b7-8e5b-2a3e32ba08f4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "74662789-933b-40b7-8e5b-2a3e32ba08f4" (UID: "74662789-933b-40b7-8e5b-2a3e32ba08f4"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.735526 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74662789-933b-40b7-8e5b-2a3e32ba08f4-logs" (OuterVolumeSpecName: "logs") pod "74662789-933b-40b7-8e5b-2a3e32ba08f4" (UID: "74662789-933b-40b7-8e5b-2a3e32ba08f4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.760016 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-scripts" (OuterVolumeSpecName: "scripts") pod "74662789-933b-40b7-8e5b-2a3e32ba08f4" (UID: "74662789-933b-40b7-8e5b-2a3e32ba08f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.771585 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "74662789-933b-40b7-8e5b-2a3e32ba08f4" (UID: "74662789-933b-40b7-8e5b-2a3e32ba08f4"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.772719 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74662789-933b-40b7-8e5b-2a3e32ba08f4-kube-api-access-t7x2h" (OuterVolumeSpecName: "kube-api-access-t7x2h") pod "74662789-933b-40b7-8e5b-2a3e32ba08f4" (UID: "74662789-933b-40b7-8e5b-2a3e32ba08f4"). InnerVolumeSpecName "kube-api-access-t7x2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.805691 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-config-data" (OuterVolumeSpecName: "config-data") pod "74662789-933b-40b7-8e5b-2a3e32ba08f4" (UID: "74662789-933b-40b7-8e5b-2a3e32ba08f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.815835 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74662789-933b-40b7-8e5b-2a3e32ba08f4" (UID: "74662789-933b-40b7-8e5b-2a3e32ba08f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.837295 4845 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74662789-933b-40b7-8e5b-2a3e32ba08f4-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.837325 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.837333 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.837343 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74662789-933b-40b7-8e5b-2a3e32ba08f4-logs\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 
07:02:29.837353 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.837365 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7x2h\" (UniqueName: \"kubernetes.io/projected/74662789-933b-40b7-8e5b-2a3e32ba08f4-kube-api-access-t7x2h\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.837425 4845 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.867117 4845 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.899060 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "74662789-933b-40b7-8e5b-2a3e32ba08f4" (UID: "74662789-933b-40b7-8e5b-2a3e32ba08f4"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.939335 4845 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:29 crc kubenswrapper[4845]: I1006 07:02:29.939373 4845 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74662789-933b-40b7-8e5b-2a3e32ba08f4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.066588 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.462239 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"07e1706a-220a-4291-b2b3-1b79660ec95b","Type":"ContainerStarted","Data":"2d443d1bfb93ba5c58cfb43f6d80d64e20449dd067cb012f704bc41fd04bdea4"} Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.465465 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74662789-933b-40b7-8e5b-2a3e32ba08f4","Type":"ContainerDied","Data":"615aca2ba944a15f2bc0f3a05573075fdb3afc2612a384c5ddc34da42368868b"} Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.465505 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.465517 4845 scope.go:117] "RemoveContainer" containerID="e01e6a18db4f10dbbf8fa6de9f9b4a5f67a783a8541148211911eefe4e6dbe7d" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.472238 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43802b75-eb4d-43b8-ab8c-0b84a7b604e7","Type":"ContainerStarted","Data":"322faeb01f6fe40be4c59a1d7b0a7238010e33e9d46a69510c6502a141568184"} Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.485455 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.485436917 podStartE2EDuration="3.485436917s" podCreationTimestamp="2025-10-06 07:02:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:02:30.48358398 +0000 UTC m=+1034.998324988" watchObservedRunningTime="2025-10-06 07:02:30.485436917 +0000 UTC m=+1035.000177925" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.514774 4845 scope.go:117] "RemoveContainer" containerID="a3548ec736a947129ca3d614632af2fa0d86e4351c4a16563b2891f660f68b28" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.535095 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.547850 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.558345 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 07:02:30 crc kubenswrapper[4845]: E1006 07:02:30.558923 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74662789-933b-40b7-8e5b-2a3e32ba08f4" containerName="glance-log" Oct 06 
07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.558945 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="74662789-933b-40b7-8e5b-2a3e32ba08f4" containerName="glance-log" Oct 06 07:02:30 crc kubenswrapper[4845]: E1006 07:02:30.558959 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74662789-933b-40b7-8e5b-2a3e32ba08f4" containerName="glance-httpd" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.558977 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="74662789-933b-40b7-8e5b-2a3e32ba08f4" containerName="glance-httpd" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.559187 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="74662789-933b-40b7-8e5b-2a3e32ba08f4" containerName="glance-log" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.559231 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="74662789-933b-40b7-8e5b-2a3e32ba08f4" containerName="glance-httpd" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.560465 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.564509 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.564681 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.567008 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.652199 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.652254 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-logs\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.652281 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.652346 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.652394 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.652424 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g52cf\" (UniqueName: \"kubernetes.io/projected/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-kube-api-access-g52cf\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.652488 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.652512 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.726244 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/cinder-scheduler-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.753850 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.753923 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.753970 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g52cf\" (UniqueName: \"kubernetes.io/projected/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-kube-api-access-g52cf\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.754030 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.754054 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 
crc kubenswrapper[4845]: I1006 07:02:30.754096 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.754114 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-logs\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.754140 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.754902 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.760499 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.761125 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-logs\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.761292 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.761517 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.761889 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.775058 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.783524 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g52cf\" (UniqueName: 
\"kubernetes.io/projected/19ec12dd-d7b0-45e8-b569-887bbdf5b6fd-kube-api-access-g52cf\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.810768 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd\") " pod="openstack/glance-default-internal-api-0" Oct 06 07:02:30 crc kubenswrapper[4845]: I1006 07:02:30.880449 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 07:02:31 crc kubenswrapper[4845]: I1006 07:02:31.490018 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43802b75-eb4d-43b8-ab8c-0b84a7b604e7","Type":"ContainerStarted","Data":"3f017c112e13719a2aaf95a7cb72c46f4dd24cb81ab1bbfc17b19c5b96486c17"} Oct 06 07:02:31 crc kubenswrapper[4845]: I1006 07:02:31.493130 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 07:02:32 crc kubenswrapper[4845]: I1006 07:02:32.250278 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74662789-933b-40b7-8e5b-2a3e32ba08f4" path="/var/lib/kubelet/pods/74662789-933b-40b7-8e5b-2a3e32ba08f4/volumes" Oct 06 07:02:32 crc kubenswrapper[4845]: I1006 07:02:32.500475 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd","Type":"ContainerStarted","Data":"7086b6075c74437eca2b886764b32c88db9968f8ad405f1db9bf7125c12cf81f"} Oct 06 07:02:32 crc kubenswrapper[4845]: I1006 07:02:32.500521 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd","Type":"ContainerStarted","Data":"9d14abeb26c4f0e364300223fe66de743f5d1988aba631ccd38576af764569cc"} Oct 06 07:02:32 crc kubenswrapper[4845]: I1006 07:02:32.504635 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43802b75-eb4d-43b8-ab8c-0b84a7b604e7","Type":"ContainerStarted","Data":"baff192c6ab1a4af8d56e7cf6bc901b8212ccdbfbd402f1064a8e70abbdd3917"} Oct 06 07:02:32 crc kubenswrapper[4845]: I1006 07:02:32.504770 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerName="ceilometer-central-agent" containerID="cri-o://e29ec5199bbb33b06248779665371f082450f10773dd8eeedb3be1bc16f6305c" gracePeriod=30 Oct 06 07:02:32 crc kubenswrapper[4845]: I1006 07:02:32.504828 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerName="proxy-httpd" containerID="cri-o://baff192c6ab1a4af8d56e7cf6bc901b8212ccdbfbd402f1064a8e70abbdd3917" gracePeriod=30 Oct 06 07:02:32 crc kubenswrapper[4845]: I1006 07:02:32.504872 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerName="ceilometer-notification-agent" containerID="cri-o://322faeb01f6fe40be4c59a1d7b0a7238010e33e9d46a69510c6502a141568184" gracePeriod=30 Oct 06 07:02:32 crc kubenswrapper[4845]: I1006 07:02:32.504832 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 07:02:32 crc kubenswrapper[4845]: I1006 07:02:32.505198 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerName="sg-core" containerID="cri-o://3f017c112e13719a2aaf95a7cb72c46f4dd24cb81ab1bbfc17b19c5b96486c17" 
gracePeriod=30 Oct 06 07:02:32 crc kubenswrapper[4845]: I1006 07:02:32.532306 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.508077878 podStartE2EDuration="5.532237666s" podCreationTimestamp="2025-10-06 07:02:27 +0000 UTC" firstStartedPulling="2025-10-06 07:02:28.636968219 +0000 UTC m=+1033.151709227" lastFinishedPulling="2025-10-06 07:02:31.661128007 +0000 UTC m=+1036.175869015" observedRunningTime="2025-10-06 07:02:32.522824934 +0000 UTC m=+1037.037565942" watchObservedRunningTime="2025-10-06 07:02:32.532237666 +0000 UTC m=+1037.046978674" Oct 06 07:02:33 crc kubenswrapper[4845]: I1006 07:02:33.516472 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"19ec12dd-d7b0-45e8-b569-887bbdf5b6fd","Type":"ContainerStarted","Data":"f5e433eb0f7ed13ff72b6ee9abc124a25ba143310900aed7397a7969d3ed6ac0"} Oct 06 07:02:33 crc kubenswrapper[4845]: I1006 07:02:33.524438 4845 generic.go:334] "Generic (PLEG): container finished" podID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerID="baff192c6ab1a4af8d56e7cf6bc901b8212ccdbfbd402f1064a8e70abbdd3917" exitCode=0 Oct 06 07:02:33 crc kubenswrapper[4845]: I1006 07:02:33.524465 4845 generic.go:334] "Generic (PLEG): container finished" podID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerID="3f017c112e13719a2aaf95a7cb72c46f4dd24cb81ab1bbfc17b19c5b96486c17" exitCode=2 Oct 06 07:02:33 crc kubenswrapper[4845]: I1006 07:02:33.524472 4845 generic.go:334] "Generic (PLEG): container finished" podID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerID="322faeb01f6fe40be4c59a1d7b0a7238010e33e9d46a69510c6502a141568184" exitCode=0 Oct 06 07:02:33 crc kubenswrapper[4845]: I1006 07:02:33.524492 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"43802b75-eb4d-43b8-ab8c-0b84a7b604e7","Type":"ContainerDied","Data":"baff192c6ab1a4af8d56e7cf6bc901b8212ccdbfbd402f1064a8e70abbdd3917"} Oct 06 07:02:33 crc kubenswrapper[4845]: I1006 07:02:33.524517 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43802b75-eb4d-43b8-ab8c-0b84a7b604e7","Type":"ContainerDied","Data":"3f017c112e13719a2aaf95a7cb72c46f4dd24cb81ab1bbfc17b19c5b96486c17"} Oct 06 07:02:33 crc kubenswrapper[4845]: I1006 07:02:33.524529 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43802b75-eb4d-43b8-ab8c-0b84a7b604e7","Type":"ContainerDied","Data":"322faeb01f6fe40be4c59a1d7b0a7238010e33e9d46a69510c6502a141568184"} Oct 06 07:02:33 crc kubenswrapper[4845]: I1006 07:02:33.547838 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.547818378 podStartE2EDuration="3.547818378s" podCreationTimestamp="2025-10-06 07:02:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:02:33.540771582 +0000 UTC m=+1038.055512610" watchObservedRunningTime="2025-10-06 07:02:33.547818378 +0000 UTC m=+1038.062559386" Oct 06 07:02:34 crc kubenswrapper[4845]: I1006 07:02:34.535558 4845 generic.go:334] "Generic (PLEG): container finished" podID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerID="e29ec5199bbb33b06248779665371f082450f10773dd8eeedb3be1bc16f6305c" exitCode=0 Oct 06 07:02:34 crc kubenswrapper[4845]: I1006 07:02:34.535628 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43802b75-eb4d-43b8-ab8c-0b84a7b604e7","Type":"ContainerDied","Data":"e29ec5199bbb33b06248779665371f082450f10773dd8eeedb3be1bc16f6305c"} Oct 06 07:02:35 crc kubenswrapper[4845]: I1006 07:02:35.963498 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/cinder-scheduler-0" Oct 06 07:02:37 crc kubenswrapper[4845]: I1006 07:02:37.852044 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 07:02:37 crc kubenswrapper[4845]: I1006 07:02:37.852440 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 07:02:37 crc kubenswrapper[4845]: I1006 07:02:37.893663 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 07:02:37 crc kubenswrapper[4845]: I1006 07:02:37.901637 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 07:02:38 crc kubenswrapper[4845]: I1006 07:02:38.587370 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 07:02:38 crc kubenswrapper[4845]: I1006 07:02:38.587702 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.758706 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.848560 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv4hx\" (UniqueName: \"kubernetes.io/projected/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-kube-api-access-xv4hx\") pod \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.848633 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-combined-ca-bundle\") pod \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.848678 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-sg-core-conf-yaml\") pod \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.848748 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-scripts\") pod \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.848835 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-config-data\") pod \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.848882 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-log-httpd\") pod \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.848982 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-run-httpd\") pod \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\" (UID: \"43802b75-eb4d-43b8-ab8c-0b84a7b604e7\") " Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.849975 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "43802b75-eb4d-43b8-ab8c-0b84a7b604e7" (UID: "43802b75-eb4d-43b8-ab8c-0b84a7b604e7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.850424 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "43802b75-eb4d-43b8-ab8c-0b84a7b604e7" (UID: "43802b75-eb4d-43b8-ab8c-0b84a7b604e7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.858281 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-kube-api-access-xv4hx" (OuterVolumeSpecName: "kube-api-access-xv4hx") pod "43802b75-eb4d-43b8-ab8c-0b84a7b604e7" (UID: "43802b75-eb4d-43b8-ab8c-0b84a7b604e7"). InnerVolumeSpecName "kube-api-access-xv4hx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.858485 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-scripts" (OuterVolumeSpecName: "scripts") pod "43802b75-eb4d-43b8-ab8c-0b84a7b604e7" (UID: "43802b75-eb4d-43b8-ab8c-0b84a7b604e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.878744 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "43802b75-eb4d-43b8-ab8c-0b84a7b604e7" (UID: "43802b75-eb4d-43b8-ab8c-0b84a7b604e7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.936094 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43802b75-eb4d-43b8-ab8c-0b84a7b604e7" (UID: "43802b75-eb4d-43b8-ab8c-0b84a7b604e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.942797 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-config-data" (OuterVolumeSpecName: "config-data") pod "43802b75-eb4d-43b8-ab8c-0b84a7b604e7" (UID: "43802b75-eb4d-43b8-ab8c-0b84a7b604e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.951006 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.951037 4845 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.951046 4845 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.951054 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv4hx\" (UniqueName: \"kubernetes.io/projected/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-kube-api-access-xv4hx\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.951066 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.951075 4845 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:39 crc kubenswrapper[4845]: I1006 07:02:39.951082 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43802b75-eb4d-43b8-ab8c-0b84a7b604e7-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.606857 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43802b75-eb4d-43b8-ab8c-0b84a7b604e7","Type":"ContainerDied","Data":"334a0268555e8ef9a267cc3b13212d0c53300c3a69dbc1f5995d18519d893d88"}
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.607251 4845 scope.go:117] "RemoveContainer" containerID="baff192c6ab1a4af8d56e7cf6bc901b8212ccdbfbd402f1064a8e70abbdd3917"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.607111 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.627841 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.632662 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.632785 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.637088 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.640004 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.658683 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:02:40 crc kubenswrapper[4845]: E1006 07:02:40.659173 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerName="proxy-httpd"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.659199 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerName="proxy-httpd"
Oct 06 07:02:40 crc kubenswrapper[4845]: E1006 07:02:40.659233 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerName="ceilometer-central-agent"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.659242 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerName="ceilometer-central-agent"
Oct 06 07:02:40 crc kubenswrapper[4845]: E1006 07:02:40.659259 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerName="ceilometer-notification-agent"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.659268 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerName="ceilometer-notification-agent"
Oct 06 07:02:40 crc kubenswrapper[4845]: E1006 07:02:40.659284 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerName="sg-core"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.659293 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerName="sg-core"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.659605 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerName="proxy-httpd"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.659705 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerName="ceilometer-central-agent"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.659724 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerName="ceilometer-notification-agent"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.659766 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" containerName="sg-core"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.662600 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.669156 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.675898 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.687093 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.764295 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr6w4\" (UniqueName: \"kubernetes.io/projected/c32d7168-61c8-4f90-8b85-e50e2c4b1593-kube-api-access-mr6w4\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.764360 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-config-data\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.764521 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.764715 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c32d7168-61c8-4f90-8b85-e50e2c4b1593-run-httpd\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.764740 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.764826 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c32d7168-61c8-4f90-8b85-e50e2c4b1593-log-httpd\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.764918 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-scripts\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.866937 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr6w4\" (UniqueName: \"kubernetes.io/projected/c32d7168-61c8-4f90-8b85-e50e2c4b1593-kube-api-access-mr6w4\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.867003 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.867023 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-config-data\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.867056 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c32d7168-61c8-4f90-8b85-e50e2c4b1593-run-httpd\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.867071 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.867098 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c32d7168-61c8-4f90-8b85-e50e2c4b1593-log-httpd\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.867129 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-scripts\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.867801 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c32d7168-61c8-4f90-8b85-e50e2c4b1593-run-httpd\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.870184 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c32d7168-61c8-4f90-8b85-e50e2c4b1593-log-httpd\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.872802 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.873194 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-config-data\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.875899 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-scripts\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.882110 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.882159 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.884962 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.893790 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr6w4\" (UniqueName: \"kubernetes.io/projected/c32d7168-61c8-4f90-8b85-e50e2c4b1593-kube-api-access-mr6w4\") pod \"ceilometer-0\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") " pod="openstack/ceilometer-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.926530 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.929954 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 06 07:02:40 crc kubenswrapper[4845]: I1006 07:02:40.983137 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 07:02:41 crc kubenswrapper[4845]: I1006 07:02:41.288832 4845 scope.go:117] "RemoveContainer" containerID="3f017c112e13719a2aaf95a7cb72c46f4dd24cb81ab1bbfc17b19c5b96486c17"
Oct 06 07:02:41 crc kubenswrapper[4845]: I1006 07:02:41.371681 4845 scope.go:117] "RemoveContainer" containerID="322faeb01f6fe40be4c59a1d7b0a7238010e33e9d46a69510c6502a141568184"
Oct 06 07:02:41 crc kubenswrapper[4845]: I1006 07:02:41.399356 4845 scope.go:117] "RemoveContainer" containerID="e29ec5199bbb33b06248779665371f082450f10773dd8eeedb3be1bc16f6305c"
Oct 06 07:02:41 crc kubenswrapper[4845]: I1006 07:02:41.617683 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 06 07:02:41 crc kubenswrapper[4845]: I1006 07:02:41.617728 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 06 07:02:41 crc kubenswrapper[4845]: I1006 07:02:41.831613 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:02:41 crc kubenswrapper[4845]: W1006 07:02:41.834365 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc32d7168_61c8_4f90_8b85_e50e2c4b1593.slice/crio-56050abb805bed325eb999769d86b21aba1e7bcd754c1f87b329c392344b9f14 WatchSource:0}: Error finding container 56050abb805bed325eb999769d86b21aba1e7bcd754c1f87b329c392344b9f14: Status 404 returned error can't find the container with id 56050abb805bed325eb999769d86b21aba1e7bcd754c1f87b329c392344b9f14
Oct 06 07:02:42 crc kubenswrapper[4845]: I1006 07:02:42.242404 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43802b75-eb4d-43b8-ab8c-0b84a7b604e7" path="/var/lib/kubelet/pods/43802b75-eb4d-43b8-ab8c-0b84a7b604e7/volumes"
Oct 06 07:02:42 crc kubenswrapper[4845]: I1006 07:02:42.554777 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:02:42 crc kubenswrapper[4845]: I1006 07:02:42.628666 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ct7br" event={"ID":"18f578e1-f6a1-4986-99af-0ab3a17cef8a","Type":"ContainerStarted","Data":"e5aff9362e0cf383edc934cb47ab890d7da14b1ae0ef3dcef167fc52d69d4808"}
Oct 06 07:02:42 crc kubenswrapper[4845]: I1006 07:02:42.632495 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c32d7168-61c8-4f90-8b85-e50e2c4b1593","Type":"ContainerStarted","Data":"30936a2d5de254889ceab69e2fe000af44966b72212799e2017d71a3a7c57326"}
Oct 06 07:02:42 crc kubenswrapper[4845]: I1006 07:02:42.632548 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c32d7168-61c8-4f90-8b85-e50e2c4b1593","Type":"ContainerStarted","Data":"56050abb805bed325eb999769d86b21aba1e7bcd754c1f87b329c392344b9f14"}
Oct 06 07:02:42 crc kubenswrapper[4845]: I1006 07:02:42.667720 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-ct7br" podStartSLOduration=1.8231947480000001 podStartE2EDuration="14.667702237s" podCreationTimestamp="2025-10-06 07:02:28 +0000 UTC" firstStartedPulling="2025-10-06 07:02:29.265578983 +0000 UTC m=+1033.780319991" lastFinishedPulling="2025-10-06 07:02:42.110086472 +0000 UTC m=+1046.624827480" observedRunningTime="2025-10-06 07:02:42.64883076 +0000 UTC m=+1047.163571778" watchObservedRunningTime="2025-10-06 07:02:42.667702237 +0000 UTC m=+1047.182443245"
Oct 06 07:02:43 crc kubenswrapper[4845]: I1006 07:02:43.641068 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c32d7168-61c8-4f90-8b85-e50e2c4b1593","Type":"ContainerStarted","Data":"fc8bb07f3feea10ca976fcbbb780f2268e71a2e31b56d7798c8c0dc5b34a7455"}
Oct 06 07:02:43 crc kubenswrapper[4845]: I1006 07:02:43.641132 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 06 07:02:43 crc kubenswrapper[4845]: I1006 07:02:43.641491 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 06 07:02:43 crc kubenswrapper[4845]: I1006 07:02:43.861857 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 06 07:02:44 crc kubenswrapper[4845]: I1006 07:02:44.032003 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 06 07:02:44 crc kubenswrapper[4845]: I1006 07:02:44.661934 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c32d7168-61c8-4f90-8b85-e50e2c4b1593","Type":"ContainerStarted","Data":"a9c44c6f7093ceac9a3f6592b37e60664f82b03794eb27e0c3a6c85bf975180e"}
Oct 06 07:02:45 crc kubenswrapper[4845]: I1006 07:02:45.673539 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c32d7168-61c8-4f90-8b85-e50e2c4b1593","Type":"ContainerStarted","Data":"9a35b0fb26b6b31415cc22a248a4ee0bc8eb7194ba9f032ae04e6e82abb2388f"}
Oct 06 07:02:45 crc kubenswrapper[4845]: I1006 07:02:45.673857 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerName="ceilometer-central-agent" containerID="cri-o://30936a2d5de254889ceab69e2fe000af44966b72212799e2017d71a3a7c57326" gracePeriod=30
Oct 06 07:02:45 crc kubenswrapper[4845]: I1006 07:02:45.673900 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerName="proxy-httpd" containerID="cri-o://9a35b0fb26b6b31415cc22a248a4ee0bc8eb7194ba9f032ae04e6e82abb2388f" gracePeriod=30
Oct 06 07:02:45 crc kubenswrapper[4845]: I1006 07:02:45.673933 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerName="sg-core" containerID="cri-o://a9c44c6f7093ceac9a3f6592b37e60664f82b03794eb27e0c3a6c85bf975180e" gracePeriod=30
Oct 06 07:02:45 crc kubenswrapper[4845]: I1006 07:02:45.673936 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerName="ceilometer-notification-agent" containerID="cri-o://fc8bb07f3feea10ca976fcbbb780f2268e71a2e31b56d7798c8c0dc5b34a7455" gracePeriod=30
Oct 06 07:02:45 crc kubenswrapper[4845]: I1006 07:02:45.705229 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.28672979 podStartE2EDuration="5.705181141s" podCreationTimestamp="2025-10-06 07:02:40 +0000 UTC" firstStartedPulling="2025-10-06 07:02:41.837314668 +0000 UTC m=+1046.352055676" lastFinishedPulling="2025-10-06 07:02:45.255766019 +0000 UTC m=+1049.770507027" observedRunningTime="2025-10-06 07:02:45.70399646 +0000 UTC m=+1050.218737468" watchObservedRunningTime="2025-10-06 07:02:45.705181141 +0000 UTC m=+1050.219922149"
Oct 06 07:02:46 crc kubenswrapper[4845]: I1006 07:02:46.693163 4845 generic.go:334] "Generic (PLEG): container finished" podID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerID="9a35b0fb26b6b31415cc22a248a4ee0bc8eb7194ba9f032ae04e6e82abb2388f" exitCode=0
Oct 06 07:02:46 crc kubenswrapper[4845]: I1006 07:02:46.693768 4845 generic.go:334] "Generic (PLEG): container finished" podID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerID="a9c44c6f7093ceac9a3f6592b37e60664f82b03794eb27e0c3a6c85bf975180e" exitCode=2
Oct 06 07:02:46 crc kubenswrapper[4845]: I1006 07:02:46.693824 4845 generic.go:334] "Generic (PLEG): container finished" podID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerID="fc8bb07f3feea10ca976fcbbb780f2268e71a2e31b56d7798c8c0dc5b34a7455" exitCode=0
Oct 06 07:02:46 crc kubenswrapper[4845]: I1006 07:02:46.693296 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c32d7168-61c8-4f90-8b85-e50e2c4b1593","Type":"ContainerDied","Data":"9a35b0fb26b6b31415cc22a248a4ee0bc8eb7194ba9f032ae04e6e82abb2388f"}
Oct 06 07:02:46 crc kubenswrapper[4845]: I1006 07:02:46.693903 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c32d7168-61c8-4f90-8b85-e50e2c4b1593","Type":"ContainerDied","Data":"a9c44c6f7093ceac9a3f6592b37e60664f82b03794eb27e0c3a6c85bf975180e"}
Oct 06 07:02:46 crc kubenswrapper[4845]: I1006 07:02:46.693968 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c32d7168-61c8-4f90-8b85-e50e2c4b1593","Type":"ContainerDied","Data":"fc8bb07f3feea10ca976fcbbb780f2268e71a2e31b56d7798c8c0dc5b34a7455"}
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.126847 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.277481 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-config-data\") pod \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") "
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.277854 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-scripts\") pod \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") "
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.277883 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr6w4\" (UniqueName: \"kubernetes.io/projected/c32d7168-61c8-4f90-8b85-e50e2c4b1593-kube-api-access-mr6w4\") pod \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") "
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.277903 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-combined-ca-bundle\") pod \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") "
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.277935 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-sg-core-conf-yaml\") pod \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") "
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.277985 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c32d7168-61c8-4f90-8b85-e50e2c4b1593-log-httpd\") pod \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") "
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.278198 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c32d7168-61c8-4f90-8b85-e50e2c4b1593-run-httpd\") pod \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\" (UID: \"c32d7168-61c8-4f90-8b85-e50e2c4b1593\") "
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.278486 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c32d7168-61c8-4f90-8b85-e50e2c4b1593-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c32d7168-61c8-4f90-8b85-e50e2c4b1593" (UID: "c32d7168-61c8-4f90-8b85-e50e2c4b1593"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.278748 4845 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c32d7168-61c8-4f90-8b85-e50e2c4b1593-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.278761 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c32d7168-61c8-4f90-8b85-e50e2c4b1593-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c32d7168-61c8-4f90-8b85-e50e2c4b1593" (UID: "c32d7168-61c8-4f90-8b85-e50e2c4b1593"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.282457 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c32d7168-61c8-4f90-8b85-e50e2c4b1593-kube-api-access-mr6w4" (OuterVolumeSpecName: "kube-api-access-mr6w4") pod "c32d7168-61c8-4f90-8b85-e50e2c4b1593" (UID: "c32d7168-61c8-4f90-8b85-e50e2c4b1593"). InnerVolumeSpecName "kube-api-access-mr6w4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.282509 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-scripts" (OuterVolumeSpecName: "scripts") pod "c32d7168-61c8-4f90-8b85-e50e2c4b1593" (UID: "c32d7168-61c8-4f90-8b85-e50e2c4b1593"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.325595 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c32d7168-61c8-4f90-8b85-e50e2c4b1593" (UID: "c32d7168-61c8-4f90-8b85-e50e2c4b1593"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.347733 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c32d7168-61c8-4f90-8b85-e50e2c4b1593" (UID: "c32d7168-61c8-4f90-8b85-e50e2c4b1593"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.380873 4845 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c32d7168-61c8-4f90-8b85-e50e2c4b1593-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.381063 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.381133 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr6w4\" (UniqueName: \"kubernetes.io/projected/c32d7168-61c8-4f90-8b85-e50e2c4b1593-kube-api-access-mr6w4\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.381225 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.381310 4845 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.386989 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-config-data" (OuterVolumeSpecName: "config-data") pod "c32d7168-61c8-4f90-8b85-e50e2c4b1593" (UID: "c32d7168-61c8-4f90-8b85-e50e2c4b1593"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.482997 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c32d7168-61c8-4f90-8b85-e50e2c4b1593-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.742749 4845 generic.go:334] "Generic (PLEG): container finished" podID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerID="30936a2d5de254889ceab69e2fe000af44966b72212799e2017d71a3a7c57326" exitCode=0
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.742792 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c32d7168-61c8-4f90-8b85-e50e2c4b1593","Type":"ContainerDied","Data":"30936a2d5de254889ceab69e2fe000af44966b72212799e2017d71a3a7c57326"}
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.742821 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c32d7168-61c8-4f90-8b85-e50e2c4b1593","Type":"ContainerDied","Data":"56050abb805bed325eb999769d86b21aba1e7bcd754c1f87b329c392344b9f14"}
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.742833 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.742855 4845 scope.go:117] "RemoveContainer" containerID="9a35b0fb26b6b31415cc22a248a4ee0bc8eb7194ba9f032ae04e6e82abb2388f"
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.778529 4845 scope.go:117] "RemoveContainer" containerID="a9c44c6f7093ceac9a3f6592b37e60664f82b03794eb27e0c3a6c85bf975180e"
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.797955 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.807307 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.844290 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:02:51 crc kubenswrapper[4845]: E1006 07:02:51.844696 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerName="ceilometer-central-agent"
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.844712 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerName="ceilometer-central-agent"
Oct 06 07:02:51 crc kubenswrapper[4845]: E1006 07:02:51.844725 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerName="proxy-httpd"
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.844732 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerName="proxy-httpd"
Oct 06 07:02:51 crc kubenswrapper[4845]: E1006 07:02:51.844751 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerName="sg-core"
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.844758 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerName="sg-core"
Oct 06 07:02:51 crc kubenswrapper[4845]: E1006 07:02:51.844766 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerName="ceilometer-notification-agent"
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.844772 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerName="ceilometer-notification-agent"
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.844950 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerName="sg-core"
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.844961 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerName="ceilometer-central-agent"
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.844976 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerName="ceilometer-notification-agent"
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.844989 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" containerName="proxy-httpd"
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.846530 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.852527 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.853875 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.854014 4845 scope.go:117] "RemoveContainer" containerID="fc8bb07f3feea10ca976fcbbb780f2268e71a2e31b56d7798c8c0dc5b34a7455"
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.854056 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.877469 4845 scope.go:117] "RemoveContainer" containerID="30936a2d5de254889ceab69e2fe000af44966b72212799e2017d71a3a7c57326"
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.905243 4845 scope.go:117] "RemoveContainer" containerID="9a35b0fb26b6b31415cc22a248a4ee0bc8eb7194ba9f032ae04e6e82abb2388f"
Oct 06 07:02:51 crc kubenswrapper[4845]: E1006 07:02:51.906213 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a35b0fb26b6b31415cc22a248a4ee0bc8eb7194ba9f032ae04e6e82abb2388f\": container with ID starting with 9a35b0fb26b6b31415cc22a248a4ee0bc8eb7194ba9f032ae04e6e82abb2388f not found: ID does not exist" containerID="9a35b0fb26b6b31415cc22a248a4ee0bc8eb7194ba9f032ae04e6e82abb2388f"
Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.906273 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a35b0fb26b6b31415cc22a248a4ee0bc8eb7194ba9f032ae04e6e82abb2388f"} err="failed to get container status \"9a35b0fb26b6b31415cc22a248a4ee0bc8eb7194ba9f032ae04e6e82abb2388f\": rpc error: code = NotFound desc = could not find container \"9a35b0fb26b6b31415cc22a248a4ee0bc8eb7194ba9f032ae04e6e82abb2388f\":
container with ID starting with 9a35b0fb26b6b31415cc22a248a4ee0bc8eb7194ba9f032ae04e6e82abb2388f not found: ID does not exist" Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.906303 4845 scope.go:117] "RemoveContainer" containerID="a9c44c6f7093ceac9a3f6592b37e60664f82b03794eb27e0c3a6c85bf975180e" Oct 06 07:02:51 crc kubenswrapper[4845]: E1006 07:02:51.906725 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9c44c6f7093ceac9a3f6592b37e60664f82b03794eb27e0c3a6c85bf975180e\": container with ID starting with a9c44c6f7093ceac9a3f6592b37e60664f82b03794eb27e0c3a6c85bf975180e not found: ID does not exist" containerID="a9c44c6f7093ceac9a3f6592b37e60664f82b03794eb27e0c3a6c85bf975180e" Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.906766 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9c44c6f7093ceac9a3f6592b37e60664f82b03794eb27e0c3a6c85bf975180e"} err="failed to get container status \"a9c44c6f7093ceac9a3f6592b37e60664f82b03794eb27e0c3a6c85bf975180e\": rpc error: code = NotFound desc = could not find container \"a9c44c6f7093ceac9a3f6592b37e60664f82b03794eb27e0c3a6c85bf975180e\": container with ID starting with a9c44c6f7093ceac9a3f6592b37e60664f82b03794eb27e0c3a6c85bf975180e not found: ID does not exist" Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.906796 4845 scope.go:117] "RemoveContainer" containerID="fc8bb07f3feea10ca976fcbbb780f2268e71a2e31b56d7798c8c0dc5b34a7455" Oct 06 07:02:51 crc kubenswrapper[4845]: E1006 07:02:51.907067 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc8bb07f3feea10ca976fcbbb780f2268e71a2e31b56d7798c8c0dc5b34a7455\": container with ID starting with fc8bb07f3feea10ca976fcbbb780f2268e71a2e31b56d7798c8c0dc5b34a7455 not found: ID does not exist" 
containerID="fc8bb07f3feea10ca976fcbbb780f2268e71a2e31b56d7798c8c0dc5b34a7455" Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.907096 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc8bb07f3feea10ca976fcbbb780f2268e71a2e31b56d7798c8c0dc5b34a7455"} err="failed to get container status \"fc8bb07f3feea10ca976fcbbb780f2268e71a2e31b56d7798c8c0dc5b34a7455\": rpc error: code = NotFound desc = could not find container \"fc8bb07f3feea10ca976fcbbb780f2268e71a2e31b56d7798c8c0dc5b34a7455\": container with ID starting with fc8bb07f3feea10ca976fcbbb780f2268e71a2e31b56d7798c8c0dc5b34a7455 not found: ID does not exist" Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.907115 4845 scope.go:117] "RemoveContainer" containerID="30936a2d5de254889ceab69e2fe000af44966b72212799e2017d71a3a7c57326" Oct 06 07:02:51 crc kubenswrapper[4845]: E1006 07:02:51.908202 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30936a2d5de254889ceab69e2fe000af44966b72212799e2017d71a3a7c57326\": container with ID starting with 30936a2d5de254889ceab69e2fe000af44966b72212799e2017d71a3a7c57326 not found: ID does not exist" containerID="30936a2d5de254889ceab69e2fe000af44966b72212799e2017d71a3a7c57326" Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.908234 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30936a2d5de254889ceab69e2fe000af44966b72212799e2017d71a3a7c57326"} err="failed to get container status \"30936a2d5de254889ceab69e2fe000af44966b72212799e2017d71a3a7c57326\": rpc error: code = NotFound desc = could not find container \"30936a2d5de254889ceab69e2fe000af44966b72212799e2017d71a3a7c57326\": container with ID starting with 30936a2d5de254889ceab69e2fe000af44966b72212799e2017d71a3a7c57326 not found: ID does not exist" Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.997266 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.997330 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/297dd975-5214-490d-a596-42722d59c5a3-log-httpd\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.997471 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-config-data\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.997634 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/297dd975-5214-490d-a596-42722d59c5a3-run-httpd\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.997662 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-scripts\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.997721 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:51 crc kubenswrapper[4845]: I1006 07:02:51.997807 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbxpt\" (UniqueName: \"kubernetes.io/projected/297dd975-5214-490d-a596-42722d59c5a3-kube-api-access-cbxpt\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:52 crc kubenswrapper[4845]: I1006 07:02:52.100022 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/297dd975-5214-490d-a596-42722d59c5a3-run-httpd\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:52 crc kubenswrapper[4845]: I1006 07:02:52.100087 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-scripts\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:52 crc kubenswrapper[4845]: I1006 07:02:52.100118 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:52 crc kubenswrapper[4845]: I1006 07:02:52.100157 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbxpt\" (UniqueName: \"kubernetes.io/projected/297dd975-5214-490d-a596-42722d59c5a3-kube-api-access-cbxpt\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 
07:02:52 crc kubenswrapper[4845]: I1006 07:02:52.100202 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:52 crc kubenswrapper[4845]: I1006 07:02:52.100244 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/297dd975-5214-490d-a596-42722d59c5a3-log-httpd\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:52 crc kubenswrapper[4845]: I1006 07:02:52.100276 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-config-data\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:52 crc kubenswrapper[4845]: I1006 07:02:52.100461 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/297dd975-5214-490d-a596-42722d59c5a3-run-httpd\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:52 crc kubenswrapper[4845]: I1006 07:02:52.101459 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/297dd975-5214-490d-a596-42722d59c5a3-log-httpd\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:52 crc kubenswrapper[4845]: I1006 07:02:52.104257 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:52 crc kubenswrapper[4845]: I1006 07:02:52.104859 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-config-data\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:52 crc kubenswrapper[4845]: I1006 07:02:52.104954 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:52 crc kubenswrapper[4845]: I1006 07:02:52.107500 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-scripts\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:52 crc kubenswrapper[4845]: I1006 07:02:52.114831 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbxpt\" (UniqueName: \"kubernetes.io/projected/297dd975-5214-490d-a596-42722d59c5a3-kube-api-access-cbxpt\") pod \"ceilometer-0\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") " pod="openstack/ceilometer-0" Oct 06 07:02:52 crc kubenswrapper[4845]: I1006 07:02:52.176185 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 07:02:52 crc kubenswrapper[4845]: I1006 07:02:52.241001 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c32d7168-61c8-4f90-8b85-e50e2c4b1593" path="/var/lib/kubelet/pods/c32d7168-61c8-4f90-8b85-e50e2c4b1593/volumes" Oct 06 07:02:52 crc kubenswrapper[4845]: I1006 07:02:52.604222 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:02:52 crc kubenswrapper[4845]: I1006 07:02:52.754623 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"297dd975-5214-490d-a596-42722d59c5a3","Type":"ContainerStarted","Data":"17b70b14dd204739d80a45e21465e7ac35c17c6036e8460958084c69aeb183b1"} Oct 06 07:02:53 crc kubenswrapper[4845]: I1006 07:02:53.766658 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"297dd975-5214-490d-a596-42722d59c5a3","Type":"ContainerStarted","Data":"e272e8a9a1ce4260b16abab94201745db8528ed805fa227d0441aed4998bcfdd"} Oct 06 07:02:53 crc kubenswrapper[4845]: I1006 07:02:53.767267 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"297dd975-5214-490d-a596-42722d59c5a3","Type":"ContainerStarted","Data":"63c80eddb23dbda3d5ed0d26250eb5ce6f1eddbc0aec31719d0d651951454aeb"} Oct 06 07:02:54 crc kubenswrapper[4845]: I1006 07:02:54.775707 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"297dd975-5214-490d-a596-42722d59c5a3","Type":"ContainerStarted","Data":"6ed8a6cf4c9f639d31c13004eb3e0927b3b839f9d3c44c0beda079768726870a"} Oct 06 07:02:55 crc kubenswrapper[4845]: I1006 07:02:55.786546 4845 generic.go:334] "Generic (PLEG): container finished" podID="18f578e1-f6a1-4986-99af-0ab3a17cef8a" containerID="e5aff9362e0cf383edc934cb47ab890d7da14b1ae0ef3dcef167fc52d69d4808" exitCode=0 Oct 06 07:02:55 crc kubenswrapper[4845]: I1006 07:02:55.786634 4845 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ct7br" event={"ID":"18f578e1-f6a1-4986-99af-0ab3a17cef8a","Type":"ContainerDied","Data":"e5aff9362e0cf383edc934cb47ab890d7da14b1ae0ef3dcef167fc52d69d4808"} Oct 06 07:02:55 crc kubenswrapper[4845]: I1006 07:02:55.790871 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"297dd975-5214-490d-a596-42722d59c5a3","Type":"ContainerStarted","Data":"7578f2c015dc605af12885d3129d99a7d626c0827944e79bff977c3fb32fd2e9"} Oct 06 07:02:55 crc kubenswrapper[4845]: I1006 07:02:55.791088 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 07:02:55 crc kubenswrapper[4845]: I1006 07:02:55.829585 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.397672176 podStartE2EDuration="4.829566128s" podCreationTimestamp="2025-10-06 07:02:51 +0000 UTC" firstStartedPulling="2025-10-06 07:02:52.606421169 +0000 UTC m=+1057.121162177" lastFinishedPulling="2025-10-06 07:02:55.038315121 +0000 UTC m=+1059.553056129" observedRunningTime="2025-10-06 07:02:55.818136077 +0000 UTC m=+1060.332877095" watchObservedRunningTime="2025-10-06 07:02:55.829566128 +0000 UTC m=+1060.344307136" Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.150660 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ct7br" Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.296146 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f578e1-f6a1-4986-99af-0ab3a17cef8a-config-data\") pod \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\" (UID: \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\") " Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.296239 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f578e1-f6a1-4986-99af-0ab3a17cef8a-combined-ca-bundle\") pod \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\" (UID: \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\") " Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.296305 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2c4d\" (UniqueName: \"kubernetes.io/projected/18f578e1-f6a1-4986-99af-0ab3a17cef8a-kube-api-access-s2c4d\") pod \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\" (UID: \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\") " Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.296703 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f578e1-f6a1-4986-99af-0ab3a17cef8a-scripts\") pod \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\" (UID: \"18f578e1-f6a1-4986-99af-0ab3a17cef8a\") " Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.301769 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18f578e1-f6a1-4986-99af-0ab3a17cef8a-kube-api-access-s2c4d" (OuterVolumeSpecName: "kube-api-access-s2c4d") pod "18f578e1-f6a1-4986-99af-0ab3a17cef8a" (UID: "18f578e1-f6a1-4986-99af-0ab3a17cef8a"). InnerVolumeSpecName "kube-api-access-s2c4d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.302078 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f578e1-f6a1-4986-99af-0ab3a17cef8a-scripts" (OuterVolumeSpecName: "scripts") pod "18f578e1-f6a1-4986-99af-0ab3a17cef8a" (UID: "18f578e1-f6a1-4986-99af-0ab3a17cef8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.322045 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f578e1-f6a1-4986-99af-0ab3a17cef8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18f578e1-f6a1-4986-99af-0ab3a17cef8a" (UID: "18f578e1-f6a1-4986-99af-0ab3a17cef8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.326049 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f578e1-f6a1-4986-99af-0ab3a17cef8a-config-data" (OuterVolumeSpecName: "config-data") pod "18f578e1-f6a1-4986-99af-0ab3a17cef8a" (UID: "18f578e1-f6a1-4986-99af-0ab3a17cef8a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.399821 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f578e1-f6a1-4986-99af-0ab3a17cef8a-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.399857 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f578e1-f6a1-4986-99af-0ab3a17cef8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.399870 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2c4d\" (UniqueName: \"kubernetes.io/projected/18f578e1-f6a1-4986-99af-0ab3a17cef8a-kube-api-access-s2c4d\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.399879 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f578e1-f6a1-4986-99af-0ab3a17cef8a-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.807267 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ct7br" event={"ID":"18f578e1-f6a1-4986-99af-0ab3a17cef8a","Type":"ContainerDied","Data":"d1d8e0679ac4f7c6cecbfb21878aeaeb544fe92093dbfeb2a1997f9cdd92a0c6"} Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.807305 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1d8e0679ac4f7c6cecbfb21878aeaeb544fe92093dbfeb2a1997f9cdd92a0c6" Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.807335 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ct7br" Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.902948 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 07:02:57 crc kubenswrapper[4845]: E1006 07:02:57.903391 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18f578e1-f6a1-4986-99af-0ab3a17cef8a" containerName="nova-cell0-conductor-db-sync" Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.903431 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="18f578e1-f6a1-4986-99af-0ab3a17cef8a" containerName="nova-cell0-conductor-db-sync" Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.903667 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="18f578e1-f6a1-4986-99af-0ab3a17cef8a" containerName="nova-cell0-conductor-db-sync" Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.904429 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.908982 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.910092 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-tghcz" Oct 06 07:02:57 crc kubenswrapper[4845]: I1006 07:02:57.911716 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 07:02:58 crc kubenswrapper[4845]: I1006 07:02:58.010976 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jxk7\" (UniqueName: \"kubernetes.io/projected/5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be-kube-api-access-7jxk7\") pod \"nova-cell0-conductor-0\" (UID: \"5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be\") " pod="openstack/nova-cell0-conductor-0" Oct 06 07:02:58 crc 
kubenswrapper[4845]: I1006 07:02:58.011156 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be\") " pod="openstack/nova-cell0-conductor-0" Oct 06 07:02:58 crc kubenswrapper[4845]: I1006 07:02:58.011292 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be\") " pod="openstack/nova-cell0-conductor-0" Oct 06 07:02:58 crc kubenswrapper[4845]: I1006 07:02:58.113491 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be\") " pod="openstack/nova-cell0-conductor-0" Oct 06 07:02:58 crc kubenswrapper[4845]: I1006 07:02:58.114196 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be\") " pod="openstack/nova-cell0-conductor-0" Oct 06 07:02:58 crc kubenswrapper[4845]: I1006 07:02:58.114324 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jxk7\" (UniqueName: \"kubernetes.io/projected/5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be-kube-api-access-7jxk7\") pod \"nova-cell0-conductor-0\" (UID: \"5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be\") " pod="openstack/nova-cell0-conductor-0" Oct 06 07:02:58 crc kubenswrapper[4845]: I1006 07:02:58.118079 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be\") " pod="openstack/nova-cell0-conductor-0" Oct 06 07:02:58 crc kubenswrapper[4845]: I1006 07:02:58.123287 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be\") " pod="openstack/nova-cell0-conductor-0" Oct 06 07:02:58 crc kubenswrapper[4845]: I1006 07:02:58.137499 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jxk7\" (UniqueName: \"kubernetes.io/projected/5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be-kube-api-access-7jxk7\") pod \"nova-cell0-conductor-0\" (UID: \"5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be\") " pod="openstack/nova-cell0-conductor-0" Oct 06 07:02:58 crc kubenswrapper[4845]: I1006 07:02:58.276115 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 07:02:58 crc kubenswrapper[4845]: I1006 07:02:58.723319 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 07:02:58 crc kubenswrapper[4845]: I1006 07:02:58.825445 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be","Type":"ContainerStarted","Data":"03d1a853c8c39de2fe0a2f59b6c64b6f6700ce0e94f89d19c4c8bb17f7cef313"} Oct 06 07:02:59 crc kubenswrapper[4845]: I1006 07:02:59.834747 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be","Type":"ContainerStarted","Data":"4e81838038e9da23873802da84eb46d710709d512a0f0d447807ea0b40e0f951"} Oct 06 07:02:59 crc kubenswrapper[4845]: I1006 07:02:59.835081 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 06 07:02:59 crc kubenswrapper[4845]: I1006 07:02:59.853183 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.853159797 podStartE2EDuration="2.853159797s" podCreationTimestamp="2025-10-06 07:02:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:02:59.849032959 +0000 UTC m=+1064.363773977" watchObservedRunningTime="2025-10-06 07:02:59.853159797 +0000 UTC m=+1064.367900805" Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.307708 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.740980 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-v4wmf"] Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.742170 4845 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v4wmf" Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.744047 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.744616 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.766110 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-v4wmf"] Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.834715 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85d4x\" (UniqueName: \"kubernetes.io/projected/fc8f98cf-3856-45f2-9825-c49dfc2cf611-kube-api-access-85d4x\") pod \"nova-cell0-cell-mapping-v4wmf\" (UID: \"fc8f98cf-3856-45f2-9825-c49dfc2cf611\") " pod="openstack/nova-cell0-cell-mapping-v4wmf" Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.835197 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc8f98cf-3856-45f2-9825-c49dfc2cf611-scripts\") pod \"nova-cell0-cell-mapping-v4wmf\" (UID: \"fc8f98cf-3856-45f2-9825-c49dfc2cf611\") " pod="openstack/nova-cell0-cell-mapping-v4wmf" Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.835233 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc8f98cf-3856-45f2-9825-c49dfc2cf611-config-data\") pod \"nova-cell0-cell-mapping-v4wmf\" (UID: \"fc8f98cf-3856-45f2-9825-c49dfc2cf611\") " pod="openstack/nova-cell0-cell-mapping-v4wmf" Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.835257 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8f98cf-3856-45f2-9825-c49dfc2cf611-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-v4wmf\" (UID: \"fc8f98cf-3856-45f2-9825-c49dfc2cf611\") " pod="openstack/nova-cell0-cell-mapping-v4wmf" Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.916704 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.918770 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.922194 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.933966 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.940549 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85d4x\" (UniqueName: \"kubernetes.io/projected/fc8f98cf-3856-45f2-9825-c49dfc2cf611-kube-api-access-85d4x\") pod \"nova-cell0-cell-mapping-v4wmf\" (UID: \"fc8f98cf-3856-45f2-9825-c49dfc2cf611\") " pod="openstack/nova-cell0-cell-mapping-v4wmf" Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.940600 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc8f98cf-3856-45f2-9825-c49dfc2cf611-scripts\") pod \"nova-cell0-cell-mapping-v4wmf\" (UID: \"fc8f98cf-3856-45f2-9825-c49dfc2cf611\") " pod="openstack/nova-cell0-cell-mapping-v4wmf" Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.940619 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc8f98cf-3856-45f2-9825-c49dfc2cf611-config-data\") pod \"nova-cell0-cell-mapping-v4wmf\" (UID: 
\"fc8f98cf-3856-45f2-9825-c49dfc2cf611\") " pod="openstack/nova-cell0-cell-mapping-v4wmf" Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.940634 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8f98cf-3856-45f2-9825-c49dfc2cf611-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-v4wmf\" (UID: \"fc8f98cf-3856-45f2-9825-c49dfc2cf611\") " pod="openstack/nova-cell0-cell-mapping-v4wmf" Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.951510 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc8f98cf-3856-45f2-9825-c49dfc2cf611-scripts\") pod \"nova-cell0-cell-mapping-v4wmf\" (UID: \"fc8f98cf-3856-45f2-9825-c49dfc2cf611\") " pod="openstack/nova-cell0-cell-mapping-v4wmf" Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.963258 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8f98cf-3856-45f2-9825-c49dfc2cf611-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-v4wmf\" (UID: \"fc8f98cf-3856-45f2-9825-c49dfc2cf611\") " pod="openstack/nova-cell0-cell-mapping-v4wmf" Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.968680 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc8f98cf-3856-45f2-9825-c49dfc2cf611-config-data\") pod \"nova-cell0-cell-mapping-v4wmf\" (UID: \"fc8f98cf-3856-45f2-9825-c49dfc2cf611\") " pod="openstack/nova-cell0-cell-mapping-v4wmf" Oct 06 07:03:03 crc kubenswrapper[4845]: I1006 07:03:03.979016 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85d4x\" (UniqueName: \"kubernetes.io/projected/fc8f98cf-3856-45f2-9825-c49dfc2cf611-kube-api-access-85d4x\") pod \"nova-cell0-cell-mapping-v4wmf\" (UID: \"fc8f98cf-3856-45f2-9825-c49dfc2cf611\") " 
pod="openstack/nova-cell0-cell-mapping-v4wmf" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.047610 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d99c844-d63a-420b-a123-2f549734e048-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4d99c844-d63a-420b-a123-2f549734e048\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.047663 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k75j4\" (UniqueName: \"kubernetes.io/projected/4d99c844-d63a-420b-a123-2f549734e048-kube-api-access-k75j4\") pod \"nova-cell1-novncproxy-0\" (UID: \"4d99c844-d63a-420b-a123-2f549734e048\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.047832 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d99c844-d63a-420b-a123-2f549734e048-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4d99c844-d63a-420b-a123-2f549734e048\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.072920 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.074758 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.080881 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.082882 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v4wmf" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.091351 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.150478 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d99c844-d63a-420b-a123-2f549734e048-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4d99c844-d63a-420b-a123-2f549734e048\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.150546 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k75j4\" (UniqueName: \"kubernetes.io/projected/4d99c844-d63a-420b-a123-2f549734e048-kube-api-access-k75j4\") pod \"nova-cell1-novncproxy-0\" (UID: \"4d99c844-d63a-420b-a123-2f549734e048\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.150573 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d99c844-d63a-420b-a123-2f549734e048-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4d99c844-d63a-420b-a123-2f549734e048\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.165231 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d99c844-d63a-420b-a123-2f549734e048-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4d99c844-d63a-420b-a123-2f549734e048\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.173306 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d99c844-d63a-420b-a123-2f549734e048-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"4d99c844-d63a-420b-a123-2f549734e048\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.186307 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k75j4\" (UniqueName: \"kubernetes.io/projected/4d99c844-d63a-420b-a123-2f549734e048-kube-api-access-k75j4\") pod \"nova-cell1-novncproxy-0\" (UID: \"4d99c844-d63a-420b-a123-2f549734e048\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.216185 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.218054 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.226920 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.243281 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.259239 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-logs\") pod \"nova-metadata-0\" (UID: \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\") " pod="openstack/nova-metadata-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.259326 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-config-data\") pod \"nova-metadata-0\" (UID: \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\") " pod="openstack/nova-metadata-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.259442 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\") " pod="openstack/nova-metadata-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.259473 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ln2h\" (UniqueName: \"kubernetes.io/projected/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-kube-api-access-7ln2h\") pod \"nova-metadata-0\" (UID: \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\") " pod="openstack/nova-metadata-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.267744 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.269426 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.270545 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.273313 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.369201 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-config-data\") pod \"nova-metadata-0\" (UID: \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\") " pod="openstack/nova-metadata-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.380545 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3beeba2c-745b-472d-9a8b-152aed3c246b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3beeba2c-745b-472d-9a8b-152aed3c246b\") " pod="openstack/nova-scheduler-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.380644 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-logs\") pod \"nova-api-0\" (UID: \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\") " pod="openstack/nova-api-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.380817 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3beeba2c-745b-472d-9a8b-152aed3c246b-config-data\") pod \"nova-scheduler-0\" (UID: \"3beeba2c-745b-472d-9a8b-152aed3c246b\") " pod="openstack/nova-scheduler-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.381563 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\") " pod="openstack/nova-metadata-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.381624 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-config-data\") pod \"nova-api-0\" (UID: \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\") " pod="openstack/nova-api-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.381650 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ln2h\" (UniqueName: \"kubernetes.io/projected/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-kube-api-access-7ln2h\") pod \"nova-metadata-0\" (UID: \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\") " pod="openstack/nova-metadata-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.381747 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\") " pod="openstack/nova-api-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.381905 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb6zz\" (UniqueName: \"kubernetes.io/projected/3beeba2c-745b-472d-9a8b-152aed3c246b-kube-api-access-bb6zz\") pod \"nova-scheduler-0\" (UID: \"3beeba2c-745b-472d-9a8b-152aed3c246b\") " pod="openstack/nova-scheduler-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.382559 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-logs\") pod \"nova-metadata-0\" 
(UID: \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\") " pod="openstack/nova-metadata-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.382645 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbxc5\" (UniqueName: \"kubernetes.io/projected/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-kube-api-access-mbxc5\") pod \"nova-api-0\" (UID: \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\") " pod="openstack/nova-api-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.386001 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.386331 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-logs\") pod \"nova-metadata-0\" (UID: \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\") " pod="openstack/nova-metadata-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.389905 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\") " pod="openstack/nova-metadata-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.403575 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-config-data\") pod \"nova-metadata-0\" (UID: \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\") " pod="openstack/nova-metadata-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.406481 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ln2h\" (UniqueName: \"kubernetes.io/projected/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-kube-api-access-7ln2h\") pod \"nova-metadata-0\" (UID: \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\") " 
pod="openstack/nova-metadata-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.442661 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66c996698c-llswm"] Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.445347 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.460626 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66c996698c-llswm"] Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.484160 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb6zz\" (UniqueName: \"kubernetes.io/projected/3beeba2c-745b-472d-9a8b-152aed3c246b-kube-api-access-bb6zz\") pod \"nova-scheduler-0\" (UID: \"3beeba2c-745b-472d-9a8b-152aed3c246b\") " pod="openstack/nova-scheduler-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.484413 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbxc5\" (UniqueName: \"kubernetes.io/projected/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-kube-api-access-mbxc5\") pod \"nova-api-0\" (UID: \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\") " pod="openstack/nova-api-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.484512 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3beeba2c-745b-472d-9a8b-152aed3c246b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3beeba2c-745b-472d-9a8b-152aed3c246b\") " pod="openstack/nova-scheduler-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.484603 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-logs\") pod \"nova-api-0\" (UID: \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\") " pod="openstack/nova-api-0" Oct 06 
07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.484705 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3beeba2c-745b-472d-9a8b-152aed3c246b-config-data\") pod \"nova-scheduler-0\" (UID: \"3beeba2c-745b-472d-9a8b-152aed3c246b\") " pod="openstack/nova-scheduler-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.484800 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-config-data\") pod \"nova-api-0\" (UID: \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\") " pod="openstack/nova-api-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.484893 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\") " pod="openstack/nova-api-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.484899 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-logs\") pod \"nova-api-0\" (UID: \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\") " pod="openstack/nova-api-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.489917 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-config-data\") pod \"nova-api-0\" (UID: \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\") " pod="openstack/nova-api-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.490087 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3beeba2c-745b-472d-9a8b-152aed3c246b-config-data\") pod \"nova-scheduler-0\" (UID: 
\"3beeba2c-745b-472d-9a8b-152aed3c246b\") " pod="openstack/nova-scheduler-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.490843 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\") " pod="openstack/nova-api-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.492359 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3beeba2c-745b-472d-9a8b-152aed3c246b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3beeba2c-745b-472d-9a8b-152aed3c246b\") " pod="openstack/nova-scheduler-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.505225 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb6zz\" (UniqueName: \"kubernetes.io/projected/3beeba2c-745b-472d-9a8b-152aed3c246b-kube-api-access-bb6zz\") pod \"nova-scheduler-0\" (UID: \"3beeba2c-745b-472d-9a8b-152aed3c246b\") " pod="openstack/nova-scheduler-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.505922 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbxc5\" (UniqueName: \"kubernetes.io/projected/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-kube-api-access-mbxc5\") pod \"nova-api-0\" (UID: \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\") " pod="openstack/nova-api-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.543129 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.586646 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-ovsdbserver-nb\") pod \"dnsmasq-dns-66c996698c-llswm\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.586695 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-dns-svc\") pod \"dnsmasq-dns-66c996698c-llswm\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.586724 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-ovsdbserver-sb\") pod \"dnsmasq-dns-66c996698c-llswm\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.586748 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lhtf\" (UniqueName: \"kubernetes.io/projected/85319c4e-6277-4946-8fbd-aba39a453df8-kube-api-access-4lhtf\") pod \"dnsmasq-dns-66c996698c-llswm\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.586832 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-config\") pod \"dnsmasq-dns-66c996698c-llswm\" (UID: 
\"85319c4e-6277-4946-8fbd-aba39a453df8\") " pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.586856 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-dns-swift-storage-0\") pod \"dnsmasq-dns-66c996698c-llswm\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.587679 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.645890 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.688343 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-ovsdbserver-nb\") pod \"dnsmasq-dns-66c996698c-llswm\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.688475 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-dns-svc\") pod \"dnsmasq-dns-66c996698c-llswm\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.688504 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-ovsdbserver-sb\") pod \"dnsmasq-dns-66c996698c-llswm\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " pod="openstack/dnsmasq-dns-66c996698c-llswm" 
Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.688928 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lhtf\" (UniqueName: \"kubernetes.io/projected/85319c4e-6277-4946-8fbd-aba39a453df8-kube-api-access-4lhtf\") pod \"dnsmasq-dns-66c996698c-llswm\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.689574 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-ovsdbserver-nb\") pod \"dnsmasq-dns-66c996698c-llswm\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.690170 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-ovsdbserver-sb\") pod \"dnsmasq-dns-66c996698c-llswm\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.690269 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-dns-svc\") pod \"dnsmasq-dns-66c996698c-llswm\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.693860 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-config\") pod \"dnsmasq-dns-66c996698c-llswm\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.693898 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-dns-swift-storage-0\") pod \"dnsmasq-dns-66c996698c-llswm\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.694681 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-dns-swift-storage-0\") pod \"dnsmasq-dns-66c996698c-llswm\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.694769 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-config\") pod \"dnsmasq-dns-66c996698c-llswm\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.720013 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lhtf\" (UniqueName: \"kubernetes.io/projected/85319c4e-6277-4946-8fbd-aba39a453df8-kube-api-access-4lhtf\") pod \"dnsmasq-dns-66c996698c-llswm\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.727828 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-v4wmf"] Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.785075 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.830171 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 07:03:04 crc kubenswrapper[4845]: W1006 07:03:04.847008 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d99c844_d63a_420b_a123_2f549734e048.slice/crio-5ea1d61a72e7b78d6c12f757260e6d4b972c3d714902d9db2980582edf37ba47 WatchSource:0}: Error finding container 5ea1d61a72e7b78d6c12f757260e6d4b972c3d714902d9db2980582edf37ba47: Status 404 returned error can't find the container with id 5ea1d61a72e7b78d6c12f757260e6d4b972c3d714902d9db2980582edf37ba47 Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.865523 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lck5q"] Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.866840 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lck5q" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.870639 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.872552 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.894260 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lck5q"] Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.909526 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-v4wmf" event={"ID":"fc8f98cf-3856-45f2-9825-c49dfc2cf611","Type":"ContainerStarted","Data":"75815b5f00f8bea44cfc8f9cf7050966af131347a3092f22cc7fee69b04303b6"} Oct 06 07:03:04 crc kubenswrapper[4845]: I1006 07:03:04.913675 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4d99c844-d63a-420b-a123-2f549734e048","Type":"ContainerStarted","Data":"5ea1d61a72e7b78d6c12f757260e6d4b972c3d714902d9db2980582edf37ba47"} Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.002311 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvq5q\" (UniqueName: \"kubernetes.io/projected/69de2345-faa8-497d-8438-ac6a9d47f7e9-kube-api-access-cvq5q\") pod \"nova-cell1-conductor-db-sync-lck5q\" (UID: \"69de2345-faa8-497d-8438-ac6a9d47f7e9\") " pod="openstack/nova-cell1-conductor-db-sync-lck5q" Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.002510 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69de2345-faa8-497d-8438-ac6a9d47f7e9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lck5q\" (UID: 
\"69de2345-faa8-497d-8438-ac6a9d47f7e9\") " pod="openstack/nova-cell1-conductor-db-sync-lck5q" Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.002557 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69de2345-faa8-497d-8438-ac6a9d47f7e9-config-data\") pod \"nova-cell1-conductor-db-sync-lck5q\" (UID: \"69de2345-faa8-497d-8438-ac6a9d47f7e9\") " pod="openstack/nova-cell1-conductor-db-sync-lck5q" Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.002585 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69de2345-faa8-497d-8438-ac6a9d47f7e9-scripts\") pod \"nova-cell1-conductor-db-sync-lck5q\" (UID: \"69de2345-faa8-497d-8438-ac6a9d47f7e9\") " pod="openstack/nova-cell1-conductor-db-sync-lck5q" Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.104706 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69de2345-faa8-497d-8438-ac6a9d47f7e9-scripts\") pod \"nova-cell1-conductor-db-sync-lck5q\" (UID: \"69de2345-faa8-497d-8438-ac6a9d47f7e9\") " pod="openstack/nova-cell1-conductor-db-sync-lck5q" Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.104777 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvq5q\" (UniqueName: \"kubernetes.io/projected/69de2345-faa8-497d-8438-ac6a9d47f7e9-kube-api-access-cvq5q\") pod \"nova-cell1-conductor-db-sync-lck5q\" (UID: \"69de2345-faa8-497d-8438-ac6a9d47f7e9\") " pod="openstack/nova-cell1-conductor-db-sync-lck5q" Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.105270 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69de2345-faa8-497d-8438-ac6a9d47f7e9-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-lck5q\" (UID: \"69de2345-faa8-497d-8438-ac6a9d47f7e9\") " pod="openstack/nova-cell1-conductor-db-sync-lck5q" Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.105348 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69de2345-faa8-497d-8438-ac6a9d47f7e9-config-data\") pod \"nova-cell1-conductor-db-sync-lck5q\" (UID: \"69de2345-faa8-497d-8438-ac6a9d47f7e9\") " pod="openstack/nova-cell1-conductor-db-sync-lck5q" Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.110080 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69de2345-faa8-497d-8438-ac6a9d47f7e9-scripts\") pod \"nova-cell1-conductor-db-sync-lck5q\" (UID: \"69de2345-faa8-497d-8438-ac6a9d47f7e9\") " pod="openstack/nova-cell1-conductor-db-sync-lck5q" Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.115934 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69de2345-faa8-497d-8438-ac6a9d47f7e9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lck5q\" (UID: \"69de2345-faa8-497d-8438-ac6a9d47f7e9\") " pod="openstack/nova-cell1-conductor-db-sync-lck5q" Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.117300 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69de2345-faa8-497d-8438-ac6a9d47f7e9-config-data\") pod \"nova-cell1-conductor-db-sync-lck5q\" (UID: \"69de2345-faa8-497d-8438-ac6a9d47f7e9\") " pod="openstack/nova-cell1-conductor-db-sync-lck5q" Oct 06 07:03:05 crc kubenswrapper[4845]: W1006 07:03:05.122064 4845 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab4a8d9a_0e7e_445b_9ae8_e131319c34b8.slice/crio-1d6a7141f4240a6e5a2fa0d0f6d02d145a6d552a53ee54aa7ad34ecc2a350f13 WatchSource:0}: Error finding container 1d6a7141f4240a6e5a2fa0d0f6d02d145a6d552a53ee54aa7ad34ecc2a350f13: Status 404 returned error can't find the container with id 1d6a7141f4240a6e5a2fa0d0f6d02d145a6d552a53ee54aa7ad34ecc2a350f13 Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.123774 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.124145 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvq5q\" (UniqueName: \"kubernetes.io/projected/69de2345-faa8-497d-8438-ac6a9d47f7e9-kube-api-access-cvq5q\") pod \"nova-cell1-conductor-db-sync-lck5q\" (UID: \"69de2345-faa8-497d-8438-ac6a9d47f7e9\") " pod="openstack/nova-cell1-conductor-db-sync-lck5q" Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.191698 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lck5q" Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.200198 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.289808 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 07:03:05 crc kubenswrapper[4845]: W1006 07:03:05.311313 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3beeba2c_745b_472d_9a8b_152aed3c246b.slice/crio-c102a419b11fff1a4880da278b4fb790116bc356d01b5927f36148ac4a2c937f WatchSource:0}: Error finding container c102a419b11fff1a4880da278b4fb790116bc356d01b5927f36148ac4a2c937f: Status 404 returned error can't find the container with id c102a419b11fff1a4880da278b4fb790116bc356d01b5927f36148ac4a2c937f Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.366192 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66c996698c-llswm"] Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.773988 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lck5q"] Oct 06 07:03:05 crc kubenswrapper[4845]: W1006 07:03:05.786007 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69de2345_faa8_497d_8438_ac6a9d47f7e9.slice/crio-43d84168e0ed99586931d380541cfd48c11cd04460f4207ea8384d8825ca93a7 WatchSource:0}: Error finding container 43d84168e0ed99586931d380541cfd48c11cd04460f4207ea8384d8825ca93a7: Status 404 returned error can't find the container with id 43d84168e0ed99586931d380541cfd48c11cd04460f4207ea8384d8825ca93a7 Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.944757 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8","Type":"ContainerStarted","Data":"1d6a7141f4240a6e5a2fa0d0f6d02d145a6d552a53ee54aa7ad34ecc2a350f13"} Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.946760 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3beeba2c-745b-472d-9a8b-152aed3c246b","Type":"ContainerStarted","Data":"c102a419b11fff1a4880da278b4fb790116bc356d01b5927f36148ac4a2c937f"} Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.950797 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7","Type":"ContainerStarted","Data":"05345383f77cb40f9f7423193b3ab97454365b2789ed3afaf516ca8e26f64714"} Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.952484 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-v4wmf" event={"ID":"fc8f98cf-3856-45f2-9825-c49dfc2cf611","Type":"ContainerStarted","Data":"dabbbf01e35174a9e6a0f846904f140b5cc24c9c6cd0e0d856d5a0c87630efbf"} Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.954850 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lck5q" event={"ID":"69de2345-faa8-497d-8438-ac6a9d47f7e9","Type":"ContainerStarted","Data":"43d84168e0ed99586931d380541cfd48c11cd04460f4207ea8384d8825ca93a7"} Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.957358 4845 generic.go:334] "Generic (PLEG): container finished" podID="85319c4e-6277-4946-8fbd-aba39a453df8" containerID="ed411d100b9b5b38d098c1a9ebdcf68acc0a25e4f7a537b7c34fe8b13a5ce08c" exitCode=0 Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.957432 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c996698c-llswm" event={"ID":"85319c4e-6277-4946-8fbd-aba39a453df8","Type":"ContainerDied","Data":"ed411d100b9b5b38d098c1a9ebdcf68acc0a25e4f7a537b7c34fe8b13a5ce08c"} Oct 06 07:03:05 crc kubenswrapper[4845]: 
I1006 07:03:05.957473 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c996698c-llswm" event={"ID":"85319c4e-6277-4946-8fbd-aba39a453df8","Type":"ContainerStarted","Data":"867ed98228715d8a54162aba80790d82e8cccb2f75236953955fd9e080687116"} Oct 06 07:03:05 crc kubenswrapper[4845]: I1006 07:03:05.982932 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-v4wmf" podStartSLOduration=2.9829109689999997 podStartE2EDuration="2.982910969s" podCreationTimestamp="2025-10-06 07:03:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:03:05.976052968 +0000 UTC m=+1070.490793986" watchObservedRunningTime="2025-10-06 07:03:05.982910969 +0000 UTC m=+1070.497651977" Oct 06 07:03:06 crc kubenswrapper[4845]: I1006 07:03:06.971614 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c996698c-llswm" event={"ID":"85319c4e-6277-4946-8fbd-aba39a453df8","Type":"ContainerStarted","Data":"a8cdf2a930bae9e44dcbb9aae18bee7ac8badd880762b942fe71dd628533493d"} Oct 06 07:03:06 crc kubenswrapper[4845]: I1006 07:03:06.972340 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:06 crc kubenswrapper[4845]: I1006 07:03:06.988084 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lck5q" event={"ID":"69de2345-faa8-497d-8438-ac6a9d47f7e9","Type":"ContainerStarted","Data":"66390510d358fc1297b9971709fb9694480c97c4bc8bc8f0041371d0d3708327"} Oct 06 07:03:06 crc kubenswrapper[4845]: I1006 07:03:06.999696 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66c996698c-llswm" podStartSLOduration=2.999677003 podStartE2EDuration="2.999677003s" podCreationTimestamp="2025-10-06 07:03:04 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:03:06.994667601 +0000 UTC m=+1071.509408629" watchObservedRunningTime="2025-10-06 07:03:06.999677003 +0000 UTC m=+1071.514418031" Oct 06 07:03:07 crc kubenswrapper[4845]: I1006 07:03:07.015946 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-lck5q" podStartSLOduration=3.015929742 podStartE2EDuration="3.015929742s" podCreationTimestamp="2025-10-06 07:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:03:07.009247186 +0000 UTC m=+1071.523988204" watchObservedRunningTime="2025-10-06 07:03:07.015929742 +0000 UTC m=+1071.530670750" Oct 06 07:03:07 crc kubenswrapper[4845]: I1006 07:03:07.729713 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 07:03:07 crc kubenswrapper[4845]: I1006 07:03:07.744612 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.004011 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7","Type":"ContainerStarted","Data":"9813e57cb127ae523b7b406e91d13797ab42e89504b659c127121590d8f107fb"} Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.004351 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7","Type":"ContainerStarted","Data":"3d120a0b4fe13879b4b6f51e798f5b542bef46492cc58da5cdcd3c253b6db6db"} Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.008210 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"4d99c844-d63a-420b-a123-2f549734e048","Type":"ContainerStarted","Data":"132dea06ce0c292cbce3471aeb4e26b414844fc39413f9d43ce9ffce9a6057da"} Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.008417 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="4d99c844-d63a-420b-a123-2f549734e048" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://132dea06ce0c292cbce3471aeb4e26b414844fc39413f9d43ce9ffce9a6057da" gracePeriod=30 Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.012866 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8","Type":"ContainerStarted","Data":"6bab1a57a29ca303ead68797c4340712666aeb9b83d80bc29695b53779193b7a"} Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.012936 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8","Type":"ContainerStarted","Data":"5ba97b91a5c43b5e77ab2373592126e66ae34b5fff23af271e27594d30875215"} Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.013118 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ab4a8d9a-0e7e-445b-9ae8-e131319c34b8" containerName="nova-metadata-log" containerID="cri-o://5ba97b91a5c43b5e77ab2373592126e66ae34b5fff23af271e27594d30875215" gracePeriod=30 Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.013267 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ab4a8d9a-0e7e-445b-9ae8-e131319c34b8" containerName="nova-metadata-metadata" containerID="cri-o://6bab1a57a29ca303ead68797c4340712666aeb9b83d80bc29695b53779193b7a" gracePeriod=30 Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.015090 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"3beeba2c-745b-472d-9a8b-152aed3c246b","Type":"ContainerStarted","Data":"01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57"} Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.040807 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.287505992 podStartE2EDuration="5.040746519s" podCreationTimestamp="2025-10-06 07:03:04 +0000 UTC" firstStartedPulling="2025-10-06 07:03:05.223879813 +0000 UTC m=+1069.738620821" lastFinishedPulling="2025-10-06 07:03:07.97712034 +0000 UTC m=+1072.491861348" observedRunningTime="2025-10-06 07:03:09.037230317 +0000 UTC m=+1073.551971345" watchObservedRunningTime="2025-10-06 07:03:09.040746519 +0000 UTC m=+1073.555487527" Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.067643 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.409132288 podStartE2EDuration="5.067623648s" podCreationTimestamp="2025-10-06 07:03:04 +0000 UTC" firstStartedPulling="2025-10-06 07:03:05.314658216 +0000 UTC m=+1069.829399224" lastFinishedPulling="2025-10-06 07:03:07.973149576 +0000 UTC m=+1072.487890584" observedRunningTime="2025-10-06 07:03:09.059697469 +0000 UTC m=+1073.574438497" watchObservedRunningTime="2025-10-06 07:03:09.067623648 +0000 UTC m=+1073.582364656" Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.084034 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.973933932 podStartE2EDuration="6.0840146s" podCreationTimestamp="2025-10-06 07:03:03 +0000 UTC" firstStartedPulling="2025-10-06 07:03:04.863045897 +0000 UTC m=+1069.377786905" lastFinishedPulling="2025-10-06 07:03:07.973126565 +0000 UTC m=+1072.487867573" observedRunningTime="2025-10-06 07:03:09.079362468 +0000 UTC m=+1073.594103476" watchObservedRunningTime="2025-10-06 07:03:09.0840146 +0000 UTC m=+1073.598755608" Oct 06 
07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.105875 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.253461674 podStartE2EDuration="5.105857446s" podCreationTimestamp="2025-10-06 07:03:04 +0000 UTC" firstStartedPulling="2025-10-06 07:03:05.124523983 +0000 UTC m=+1069.639264991" lastFinishedPulling="2025-10-06 07:03:07.976919755 +0000 UTC m=+1072.491660763" observedRunningTime="2025-10-06 07:03:09.09650656 +0000 UTC m=+1073.611247578" watchObservedRunningTime="2025-10-06 07:03:09.105857446 +0000 UTC m=+1073.620598454" Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.245283 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.551827 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.551871 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.630017 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.646511 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.734143 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-logs\") pod \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\" (UID: \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\") " Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.734266 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-combined-ca-bundle\") pod \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\" (UID: \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\") " Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.734364 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-config-data\") pod \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\" (UID: \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\") " Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.734491 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ln2h\" (UniqueName: \"kubernetes.io/projected/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-kube-api-access-7ln2h\") pod \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\" (UID: \"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8\") " Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.735197 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-logs" (OuterVolumeSpecName: "logs") pod "ab4a8d9a-0e7e-445b-9ae8-e131319c34b8" (UID: "ab4a8d9a-0e7e-445b-9ae8-e131319c34b8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.742353 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-kube-api-access-7ln2h" (OuterVolumeSpecName: "kube-api-access-7ln2h") pod "ab4a8d9a-0e7e-445b-9ae8-e131319c34b8" (UID: "ab4a8d9a-0e7e-445b-9ae8-e131319c34b8"). InnerVolumeSpecName "kube-api-access-7ln2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.763862 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab4a8d9a-0e7e-445b-9ae8-e131319c34b8" (UID: "ab4a8d9a-0e7e-445b-9ae8-e131319c34b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.767911 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-config-data" (OuterVolumeSpecName: "config-data") pod "ab4a8d9a-0e7e-445b-9ae8-e131319c34b8" (UID: "ab4a8d9a-0e7e-445b-9ae8-e131319c34b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.836988 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.837446 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ln2h\" (UniqueName: \"kubernetes.io/projected/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-kube-api-access-7ln2h\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.837463 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-logs\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:09 crc kubenswrapper[4845]: I1006 07:03:09.837475 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.029752 4845 generic.go:334] "Generic (PLEG): container finished" podID="ab4a8d9a-0e7e-445b-9ae8-e131319c34b8" containerID="6bab1a57a29ca303ead68797c4340712666aeb9b83d80bc29695b53779193b7a" exitCode=0 Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.029800 4845 generic.go:334] "Generic (PLEG): container finished" podID="ab4a8d9a-0e7e-445b-9ae8-e131319c34b8" containerID="5ba97b91a5c43b5e77ab2373592126e66ae34b5fff23af271e27594d30875215" exitCode=143 Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.030172 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8","Type":"ContainerDied","Data":"6bab1a57a29ca303ead68797c4340712666aeb9b83d80bc29695b53779193b7a"} Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.030234 4845 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8","Type":"ContainerDied","Data":"5ba97b91a5c43b5e77ab2373592126e66ae34b5fff23af271e27594d30875215"} Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.030247 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab4a8d9a-0e7e-445b-9ae8-e131319c34b8","Type":"ContainerDied","Data":"1d6a7141f4240a6e5a2fa0d0f6d02d145a6d552a53ee54aa7ad34ecc2a350f13"} Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.030242 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.030264 4845 scope.go:117] "RemoveContainer" containerID="6bab1a57a29ca303ead68797c4340712666aeb9b83d80bc29695b53779193b7a" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.088852 4845 scope.go:117] "RemoveContainer" containerID="5ba97b91a5c43b5e77ab2373592126e66ae34b5fff23af271e27594d30875215" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.094394 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.122552 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.136801 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 07:03:10 crc kubenswrapper[4845]: E1006 07:03:10.137416 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab4a8d9a-0e7e-445b-9ae8-e131319c34b8" containerName="nova-metadata-metadata" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.137439 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab4a8d9a-0e7e-445b-9ae8-e131319c34b8" containerName="nova-metadata-metadata" Oct 06 07:03:10 crc kubenswrapper[4845]: E1006 07:03:10.137460 4845 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab4a8d9a-0e7e-445b-9ae8-e131319c34b8" containerName="nova-metadata-log" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.137467 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab4a8d9a-0e7e-445b-9ae8-e131319c34b8" containerName="nova-metadata-log" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.140216 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab4a8d9a-0e7e-445b-9ae8-e131319c34b8" containerName="nova-metadata-log" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.140565 4845 scope.go:117] "RemoveContainer" containerID="6bab1a57a29ca303ead68797c4340712666aeb9b83d80bc29695b53779193b7a" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.140788 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab4a8d9a-0e7e-445b-9ae8-e131319c34b8" containerName="nova-metadata-metadata" Oct 06 07:03:10 crc kubenswrapper[4845]: E1006 07:03:10.142194 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bab1a57a29ca303ead68797c4340712666aeb9b83d80bc29695b53779193b7a\": container with ID starting with 6bab1a57a29ca303ead68797c4340712666aeb9b83d80bc29695b53779193b7a not found: ID does not exist" containerID="6bab1a57a29ca303ead68797c4340712666aeb9b83d80bc29695b53779193b7a" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.142243 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bab1a57a29ca303ead68797c4340712666aeb9b83d80bc29695b53779193b7a"} err="failed to get container status \"6bab1a57a29ca303ead68797c4340712666aeb9b83d80bc29695b53779193b7a\": rpc error: code = NotFound desc = could not find container \"6bab1a57a29ca303ead68797c4340712666aeb9b83d80bc29695b53779193b7a\": container with ID starting with 6bab1a57a29ca303ead68797c4340712666aeb9b83d80bc29695b53779193b7a not found: ID does not exist" Oct 06 
07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.142274 4845 scope.go:117] "RemoveContainer" containerID="5ba97b91a5c43b5e77ab2373592126e66ae34b5fff23af271e27594d30875215" Oct 06 07:03:10 crc kubenswrapper[4845]: E1006 07:03:10.142737 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ba97b91a5c43b5e77ab2373592126e66ae34b5fff23af271e27594d30875215\": container with ID starting with 5ba97b91a5c43b5e77ab2373592126e66ae34b5fff23af271e27594d30875215 not found: ID does not exist" containerID="5ba97b91a5c43b5e77ab2373592126e66ae34b5fff23af271e27594d30875215" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.142770 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ba97b91a5c43b5e77ab2373592126e66ae34b5fff23af271e27594d30875215"} err="failed to get container status \"5ba97b91a5c43b5e77ab2373592126e66ae34b5fff23af271e27594d30875215\": rpc error: code = NotFound desc = could not find container \"5ba97b91a5c43b5e77ab2373592126e66ae34b5fff23af271e27594d30875215\": container with ID starting with 5ba97b91a5c43b5e77ab2373592126e66ae34b5fff23af271e27594d30875215 not found: ID does not exist" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.142788 4845 scope.go:117] "RemoveContainer" containerID="6bab1a57a29ca303ead68797c4340712666aeb9b83d80bc29695b53779193b7a" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.144256 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bab1a57a29ca303ead68797c4340712666aeb9b83d80bc29695b53779193b7a"} err="failed to get container status \"6bab1a57a29ca303ead68797c4340712666aeb9b83d80bc29695b53779193b7a\": rpc error: code = NotFound desc = could not find container \"6bab1a57a29ca303ead68797c4340712666aeb9b83d80bc29695b53779193b7a\": container with ID starting with 6bab1a57a29ca303ead68797c4340712666aeb9b83d80bc29695b53779193b7a not found: ID does not 
exist" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.144290 4845 scope.go:117] "RemoveContainer" containerID="5ba97b91a5c43b5e77ab2373592126e66ae34b5fff23af271e27594d30875215" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.147193 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ba97b91a5c43b5e77ab2373592126e66ae34b5fff23af271e27594d30875215"} err="failed to get container status \"5ba97b91a5c43b5e77ab2373592126e66ae34b5fff23af271e27594d30875215\": rpc error: code = NotFound desc = could not find container \"5ba97b91a5c43b5e77ab2373592126e66ae34b5fff23af271e27594d30875215\": container with ID starting with 5ba97b91a5c43b5e77ab2373592126e66ae34b5fff23af271e27594d30875215 not found: ID does not exist" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.147453 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.147549 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.151922 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.155964 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.238871 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab4a8d9a-0e7e-445b-9ae8-e131319c34b8" path="/var/lib/kubelet/pods/ab4a8d9a-0e7e-445b-9ae8-e131319c34b8/volumes" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.247267 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " pod="openstack/nova-metadata-0" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.247474 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-config-data\") pod \"nova-metadata-0\" (UID: \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " pod="openstack/nova-metadata-0" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.247526 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-logs\") pod \"nova-metadata-0\" (UID: \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " pod="openstack/nova-metadata-0" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.247543 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrcxx\" (UniqueName: 
\"kubernetes.io/projected/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-kube-api-access-wrcxx\") pod \"nova-metadata-0\" (UID: \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " pod="openstack/nova-metadata-0" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.247633 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " pod="openstack/nova-metadata-0" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.349828 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-config-data\") pod \"nova-metadata-0\" (UID: \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " pod="openstack/nova-metadata-0" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.349955 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-logs\") pod \"nova-metadata-0\" (UID: \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " pod="openstack/nova-metadata-0" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.349977 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrcxx\" (UniqueName: \"kubernetes.io/projected/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-kube-api-access-wrcxx\") pod \"nova-metadata-0\" (UID: \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " pod="openstack/nova-metadata-0" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.350004 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " pod="openstack/nova-metadata-0" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.350107 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " pod="openstack/nova-metadata-0" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.351999 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-logs\") pod \"nova-metadata-0\" (UID: \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " pod="openstack/nova-metadata-0" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.354590 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " pod="openstack/nova-metadata-0" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.355741 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-config-data\") pod \"nova-metadata-0\" (UID: \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " pod="openstack/nova-metadata-0" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.357272 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " pod="openstack/nova-metadata-0" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.388105 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wrcxx\" (UniqueName: \"kubernetes.io/projected/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-kube-api-access-wrcxx\") pod \"nova-metadata-0\" (UID: \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " pod="openstack/nova-metadata-0" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.485138 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 07:03:10 crc kubenswrapper[4845]: I1006 07:03:10.930119 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 07:03:10 crc kubenswrapper[4845]: W1006 07:03:10.935681 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4e37ede_22f6_42b4_8f7d_8c6cafc9fe36.slice/crio-b31c6e019be117146ec796b35018bc4a25e30ec8b3db06b71f4d2178e854fa87 WatchSource:0}: Error finding container b31c6e019be117146ec796b35018bc4a25e30ec8b3db06b71f4d2178e854fa87: Status 404 returned error can't find the container with id b31c6e019be117146ec796b35018bc4a25e30ec8b3db06b71f4d2178e854fa87 Oct 06 07:03:11 crc kubenswrapper[4845]: I1006 07:03:11.041012 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36","Type":"ContainerStarted","Data":"b31c6e019be117146ec796b35018bc4a25e30ec8b3db06b71f4d2178e854fa87"} Oct 06 07:03:12 crc kubenswrapper[4845]: I1006 07:03:12.054349 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36","Type":"ContainerStarted","Data":"b9734cbd7007b7dc6840238688fe114f567cc660cb09a03d34a60686f010ebca"} Oct 06 07:03:12 crc kubenswrapper[4845]: I1006 07:03:12.054706 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36","Type":"ContainerStarted","Data":"21e9738c1c6b8688cb3943f0fefdbf33913748e44102a5ddee9c00a74e880eba"} Oct 06 07:03:12 crc kubenswrapper[4845]: I1006 07:03:12.089397 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.089361737 podStartE2EDuration="2.089361737s" podCreationTimestamp="2025-10-06 07:03:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:03:12.078019728 +0000 UTC m=+1076.592760756" watchObservedRunningTime="2025-10-06 07:03:12.089361737 +0000 UTC m=+1076.604102745" Oct 06 07:03:13 crc kubenswrapper[4845]: I1006 07:03:13.066047 4845 generic.go:334] "Generic (PLEG): container finished" podID="fc8f98cf-3856-45f2-9825-c49dfc2cf611" containerID="dabbbf01e35174a9e6a0f846904f140b5cc24c9c6cd0e0d856d5a0c87630efbf" exitCode=0 Oct 06 07:03:13 crc kubenswrapper[4845]: I1006 07:03:13.066151 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-v4wmf" event={"ID":"fc8f98cf-3856-45f2-9825-c49dfc2cf611","Type":"ContainerDied","Data":"dabbbf01e35174a9e6a0f846904f140b5cc24c9c6cd0e0d856d5a0c87630efbf"} Oct 06 07:03:13 crc kubenswrapper[4845]: I1006 07:03:13.068625 4845 generic.go:334] "Generic (PLEG): container finished" podID="69de2345-faa8-497d-8438-ac6a9d47f7e9" containerID="66390510d358fc1297b9971709fb9694480c97c4bc8bc8f0041371d0d3708327" exitCode=0 Oct 06 07:03:13 crc kubenswrapper[4845]: I1006 07:03:13.068712 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lck5q" event={"ID":"69de2345-faa8-497d-8438-ac6a9d47f7e9","Type":"ContainerDied","Data":"66390510d358fc1297b9971709fb9694480c97c4bc8bc8f0041371d0d3708327"} Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.584601 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lck5q" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.588106 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.588163 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.590923 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v4wmf" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.646206 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.670645 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc8f98cf-3856-45f2-9825-c49dfc2cf611-scripts\") pod \"fc8f98cf-3856-45f2-9825-c49dfc2cf611\" (UID: \"fc8f98cf-3856-45f2-9825-c49dfc2cf611\") " Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.670695 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69de2345-faa8-497d-8438-ac6a9d47f7e9-scripts\") pod \"69de2345-faa8-497d-8438-ac6a9d47f7e9\" (UID: \"69de2345-faa8-497d-8438-ac6a9d47f7e9\") " Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.670747 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvq5q\" (UniqueName: \"kubernetes.io/projected/69de2345-faa8-497d-8438-ac6a9d47f7e9-kube-api-access-cvq5q\") pod \"69de2345-faa8-497d-8438-ac6a9d47f7e9\" (UID: \"69de2345-faa8-497d-8438-ac6a9d47f7e9\") " Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.670782 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fc8f98cf-3856-45f2-9825-c49dfc2cf611-combined-ca-bundle\") pod \"fc8f98cf-3856-45f2-9825-c49dfc2cf611\" (UID: \"fc8f98cf-3856-45f2-9825-c49dfc2cf611\") " Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.670887 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc8f98cf-3856-45f2-9825-c49dfc2cf611-config-data\") pod \"fc8f98cf-3856-45f2-9825-c49dfc2cf611\" (UID: \"fc8f98cf-3856-45f2-9825-c49dfc2cf611\") " Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.670931 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69de2345-faa8-497d-8438-ac6a9d47f7e9-config-data\") pod \"69de2345-faa8-497d-8438-ac6a9d47f7e9\" (UID: \"69de2345-faa8-497d-8438-ac6a9d47f7e9\") " Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.670993 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85d4x\" (UniqueName: \"kubernetes.io/projected/fc8f98cf-3856-45f2-9825-c49dfc2cf611-kube-api-access-85d4x\") pod \"fc8f98cf-3856-45f2-9825-c49dfc2cf611\" (UID: \"fc8f98cf-3856-45f2-9825-c49dfc2cf611\") " Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.671053 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69de2345-faa8-497d-8438-ac6a9d47f7e9-combined-ca-bundle\") pod \"69de2345-faa8-497d-8438-ac6a9d47f7e9\" (UID: \"69de2345-faa8-497d-8438-ac6a9d47f7e9\") " Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.688606 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc8f98cf-3856-45f2-9825-c49dfc2cf611-scripts" (OuterVolumeSpecName: "scripts") pod "fc8f98cf-3856-45f2-9825-c49dfc2cf611" (UID: "fc8f98cf-3856-45f2-9825-c49dfc2cf611"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.688688 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69de2345-faa8-497d-8438-ac6a9d47f7e9-kube-api-access-cvq5q" (OuterVolumeSpecName: "kube-api-access-cvq5q") pod "69de2345-faa8-497d-8438-ac6a9d47f7e9" (UID: "69de2345-faa8-497d-8438-ac6a9d47f7e9"). InnerVolumeSpecName "kube-api-access-cvq5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.688784 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc8f98cf-3856-45f2-9825-c49dfc2cf611-kube-api-access-85d4x" (OuterVolumeSpecName: "kube-api-access-85d4x") pod "fc8f98cf-3856-45f2-9825-c49dfc2cf611" (UID: "fc8f98cf-3856-45f2-9825-c49dfc2cf611"). InnerVolumeSpecName "kube-api-access-85d4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.688792 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.707882 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69de2345-faa8-497d-8438-ac6a9d47f7e9-scripts" (OuterVolumeSpecName: "scripts") pod "69de2345-faa8-497d-8438-ac6a9d47f7e9" (UID: "69de2345-faa8-497d-8438-ac6a9d47f7e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.718624 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69de2345-faa8-497d-8438-ac6a9d47f7e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69de2345-faa8-497d-8438-ac6a9d47f7e9" (UID: "69de2345-faa8-497d-8438-ac6a9d47f7e9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.725008 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc8f98cf-3856-45f2-9825-c49dfc2cf611-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc8f98cf-3856-45f2-9825-c49dfc2cf611" (UID: "fc8f98cf-3856-45f2-9825-c49dfc2cf611"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.737095 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69de2345-faa8-497d-8438-ac6a9d47f7e9-config-data" (OuterVolumeSpecName: "config-data") pod "69de2345-faa8-497d-8438-ac6a9d47f7e9" (UID: "69de2345-faa8-497d-8438-ac6a9d47f7e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.744780 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc8f98cf-3856-45f2-9825-c49dfc2cf611-config-data" (OuterVolumeSpecName: "config-data") pod "fc8f98cf-3856-45f2-9825-c49dfc2cf611" (UID: "fc8f98cf-3856-45f2-9825-c49dfc2cf611"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.773321 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69de2345-faa8-497d-8438-ac6a9d47f7e9-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.773359 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85d4x\" (UniqueName: \"kubernetes.io/projected/fc8f98cf-3856-45f2-9825-c49dfc2cf611-kube-api-access-85d4x\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.773386 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69de2345-faa8-497d-8438-ac6a9d47f7e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.773401 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc8f98cf-3856-45f2-9825-c49dfc2cf611-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.773414 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69de2345-faa8-497d-8438-ac6a9d47f7e9-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.773426 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvq5q\" (UniqueName: \"kubernetes.io/projected/69de2345-faa8-497d-8438-ac6a9d47f7e9-kube-api-access-cvq5q\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.773437 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8f98cf-3856-45f2-9825-c49dfc2cf611-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.773448 4845 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc8f98cf-3856-45f2-9825-c49dfc2cf611-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.788618 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66c996698c-llswm" Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.843363 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66c766bfdf-qf7h6"] Oct 06 07:03:14 crc kubenswrapper[4845]: I1006 07:03:14.843615 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" podUID="967c80f8-047c-4c0a-a81f-25b6741caf0a" containerName="dnsmasq-dns" containerID="cri-o://c5b80cc68c9a4f10f90feb1440ad39a8b9cb5d60226fbc79ace467f7f8c0b17a" gracePeriod=10 Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.097014 4845 generic.go:334] "Generic (PLEG): container finished" podID="967c80f8-047c-4c0a-a81f-25b6741caf0a" containerID="c5b80cc68c9a4f10f90feb1440ad39a8b9cb5d60226fbc79ace467f7f8c0b17a" exitCode=0 Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.097096 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" event={"ID":"967c80f8-047c-4c0a-a81f-25b6741caf0a","Type":"ContainerDied","Data":"c5b80cc68c9a4f10f90feb1440ad39a8b9cb5d60226fbc79ace467f7f8c0b17a"} Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.101713 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-v4wmf" event={"ID":"fc8f98cf-3856-45f2-9825-c49dfc2cf611","Type":"ContainerDied","Data":"75815b5f00f8bea44cfc8f9cf7050966af131347a3092f22cc7fee69b04303b6"} Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.101745 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75815b5f00f8bea44cfc8f9cf7050966af131347a3092f22cc7fee69b04303b6" Oct 
06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.101809 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v4wmf" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.121307 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lck5q" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.121700 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lck5q" event={"ID":"69de2345-faa8-497d-8438-ac6a9d47f7e9","Type":"ContainerDied","Data":"43d84168e0ed99586931d380541cfd48c11cd04460f4207ea8384d8825ca93a7"} Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.121727 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43d84168e0ed99586931d380541cfd48c11cd04460f4207ea8384d8825ca93a7" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.170137 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 07:03:15 crc kubenswrapper[4845]: E1006 07:03:15.171178 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc8f98cf-3856-45f2-9825-c49dfc2cf611" containerName="nova-manage" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.171193 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc8f98cf-3856-45f2-9825-c49dfc2cf611" containerName="nova-manage" Oct 06 07:03:15 crc kubenswrapper[4845]: E1006 07:03:15.171230 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69de2345-faa8-497d-8438-ac6a9d47f7e9" containerName="nova-cell1-conductor-db-sync" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.171236 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="69de2345-faa8-497d-8438-ac6a9d47f7e9" containerName="nova-cell1-conductor-db-sync" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.171465 4845 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="69de2345-faa8-497d-8438-ac6a9d47f7e9" containerName="nova-cell1-conductor-db-sync" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.171490 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc8f98cf-3856-45f2-9825-c49dfc2cf611" containerName="nova-manage" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.172259 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.177119 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.191327 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.215790 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.268282 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.268552 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7" containerName="nova-api-log" containerID="cri-o://3d120a0b4fe13879b4b6f51e798f5b542bef46492cc58da5cdcd3c253b6db6db" gracePeriod=30 Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.268666 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7" containerName="nova-api-api" containerID="cri-o://9813e57cb127ae523b7b406e91d13797ab42e89504b659c127121590d8f107fb" gracePeriod=30 Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.275716 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": EOF" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.276520 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": EOF" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.284519 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6ea194-3e38-44a9-9ac4-0182b588cee2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"db6ea194-3e38-44a9-9ac4-0182b588cee2\") " pod="openstack/nova-cell1-conductor-0" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.284590 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99svw\" (UniqueName: \"kubernetes.io/projected/db6ea194-3e38-44a9-9ac4-0182b588cee2-kube-api-access-99svw\") pod \"nova-cell1-conductor-0\" (UID: \"db6ea194-3e38-44a9-9ac4-0182b588cee2\") " pod="openstack/nova-cell1-conductor-0" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.284639 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db6ea194-3e38-44a9-9ac4-0182b588cee2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"db6ea194-3e38-44a9-9ac4-0182b588cee2\") " pod="openstack/nova-cell1-conductor-0" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.293705 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.305520 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.314476 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.315652 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36" containerName="nova-metadata-log" containerID="cri-o://21e9738c1c6b8688cb3943f0fefdbf33913748e44102a5ddee9c00a74e880eba" gracePeriod=30 Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.315808 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36" containerName="nova-metadata-metadata" containerID="cri-o://b9734cbd7007b7dc6840238688fe114f567cc660cb09a03d34a60686f010ebca" gracePeriod=30 Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.389538 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-dns-swift-storage-0\") pod \"967c80f8-047c-4c0a-a81f-25b6741caf0a\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.389637 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-config\") pod \"967c80f8-047c-4c0a-a81f-25b6741caf0a\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.389764 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-ovsdbserver-sb\") pod \"967c80f8-047c-4c0a-a81f-25b6741caf0a\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.389890 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mdzs\" (UniqueName: \"kubernetes.io/projected/967c80f8-047c-4c0a-a81f-25b6741caf0a-kube-api-access-2mdzs\") pod \"967c80f8-047c-4c0a-a81f-25b6741caf0a\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.389958 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-dns-svc\") pod \"967c80f8-047c-4c0a-a81f-25b6741caf0a\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.390003 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-ovsdbserver-nb\") pod \"967c80f8-047c-4c0a-a81f-25b6741caf0a\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.390224 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99svw\" (UniqueName: \"kubernetes.io/projected/db6ea194-3e38-44a9-9ac4-0182b588cee2-kube-api-access-99svw\") pod \"nova-cell1-conductor-0\" (UID: \"db6ea194-3e38-44a9-9ac4-0182b588cee2\") " pod="openstack/nova-cell1-conductor-0" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.390307 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db6ea194-3e38-44a9-9ac4-0182b588cee2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"db6ea194-3e38-44a9-9ac4-0182b588cee2\") " pod="openstack/nova-cell1-conductor-0" 
Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.390473 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6ea194-3e38-44a9-9ac4-0182b588cee2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"db6ea194-3e38-44a9-9ac4-0182b588cee2\") " pod="openstack/nova-cell1-conductor-0" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.395357 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db6ea194-3e38-44a9-9ac4-0182b588cee2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"db6ea194-3e38-44a9-9ac4-0182b588cee2\") " pod="openstack/nova-cell1-conductor-0" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.398058 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/967c80f8-047c-4c0a-a81f-25b6741caf0a-kube-api-access-2mdzs" (OuterVolumeSpecName: "kube-api-access-2mdzs") pod "967c80f8-047c-4c0a-a81f-25b6741caf0a" (UID: "967c80f8-047c-4c0a-a81f-25b6741caf0a"). InnerVolumeSpecName "kube-api-access-2mdzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.399799 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6ea194-3e38-44a9-9ac4-0182b588cee2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"db6ea194-3e38-44a9-9ac4-0182b588cee2\") " pod="openstack/nova-cell1-conductor-0" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.416787 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99svw\" (UniqueName: \"kubernetes.io/projected/db6ea194-3e38-44a9-9ac4-0182b588cee2-kube-api-access-99svw\") pod \"nova-cell1-conductor-0\" (UID: \"db6ea194-3e38-44a9-9ac4-0182b588cee2\") " pod="openstack/nova-cell1-conductor-0" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.452795 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "967c80f8-047c-4c0a-a81f-25b6741caf0a" (UID: "967c80f8-047c-4c0a-a81f-25b6741caf0a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.457830 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "967c80f8-047c-4c0a-a81f-25b6741caf0a" (UID: "967c80f8-047c-4c0a-a81f-25b6741caf0a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.462957 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "967c80f8-047c-4c0a-a81f-25b6741caf0a" (UID: "967c80f8-047c-4c0a-a81f-25b6741caf0a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.481201 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "967c80f8-047c-4c0a-a81f-25b6741caf0a" (UID: "967c80f8-047c-4c0a-a81f-25b6741caf0a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.486105 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.486157 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.492411 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-config" (OuterVolumeSpecName: "config") pod "967c80f8-047c-4c0a-a81f-25b6741caf0a" (UID: "967c80f8-047c-4c0a-a81f-25b6741caf0a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.492994 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-config\") pod \"967c80f8-047c-4c0a-a81f-25b6741caf0a\" (UID: \"967c80f8-047c-4c0a-a81f-25b6741caf0a\") " Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.493404 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.493423 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mdzs\" (UniqueName: \"kubernetes.io/projected/967c80f8-047c-4c0a-a81f-25b6741caf0a-kube-api-access-2mdzs\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.493435 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.493445 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.493453 4845 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:15 crc kubenswrapper[4845]: W1006 07:03:15.493532 4845 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/967c80f8-047c-4c0a-a81f-25b6741caf0a/volumes/kubernetes.io~configmap/config Oct 06 07:03:15 
crc kubenswrapper[4845]: I1006 07:03:15.493544 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-config" (OuterVolumeSpecName: "config") pod "967c80f8-047c-4c0a-a81f-25b6741caf0a" (UID: "967c80f8-047c-4c0a-a81f-25b6741caf0a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.531250 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 07:03:15 crc kubenswrapper[4845]: I1006 07:03:15.594616 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/967c80f8-047c-4c0a-a81f-25b6741caf0a-config\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.022750 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 07:03:16 crc kubenswrapper[4845]: W1006 07:03:16.030433 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb6ea194_3e38_44a9_9ac4_0182b588cee2.slice/crio-ef972697979a36280c74ccd538997990b5971729c1e8b16d889f119b16df91d3 WatchSource:0}: Error finding container ef972697979a36280c74ccd538997990b5971729c1e8b16d889f119b16df91d3: Status 404 returned error can't find the container with id ef972697979a36280c74ccd538997990b5971729c1e8b16d889f119b16df91d3 Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.139229 4845 generic.go:334] "Generic (PLEG): container finished" podID="a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36" containerID="b9734cbd7007b7dc6840238688fe114f567cc660cb09a03d34a60686f010ebca" exitCode=0 Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.139278 4845 generic.go:334] "Generic (PLEG): container finished" podID="a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36" 
containerID="21e9738c1c6b8688cb3943f0fefdbf33913748e44102a5ddee9c00a74e880eba" exitCode=143 Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.139318 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36","Type":"ContainerDied","Data":"b9734cbd7007b7dc6840238688fe114f567cc660cb09a03d34a60686f010ebca"} Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.139356 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36","Type":"ContainerDied","Data":"21e9738c1c6b8688cb3943f0fefdbf33913748e44102a5ddee9c00a74e880eba"} Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.141413 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" event={"ID":"967c80f8-047c-4c0a-a81f-25b6741caf0a","Type":"ContainerDied","Data":"05016e4f1893d7e87e3cf77c6dfe80abee78a39d5e6c10d66414a86b723799d9"} Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.141496 4845 scope.go:117] "RemoveContainer" containerID="c5b80cc68c9a4f10f90feb1440ad39a8b9cb5d60226fbc79ace467f7f8c0b17a" Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.141712 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66c766bfdf-qf7h6" Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.152943 4845 generic.go:334] "Generic (PLEG): container finished" podID="6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7" containerID="3d120a0b4fe13879b4b6f51e798f5b542bef46492cc58da5cdcd3c253b6db6db" exitCode=143 Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.153038 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7","Type":"ContainerDied","Data":"3d120a0b4fe13879b4b6f51e798f5b542bef46492cc58da5cdcd3c253b6db6db"} Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.157658 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"db6ea194-3e38-44a9-9ac4-0182b588cee2","Type":"ContainerStarted","Data":"ef972697979a36280c74ccd538997990b5971729c1e8b16d889f119b16df91d3"} Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.219712 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66c766bfdf-qf7h6"] Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.222600 4845 scope.go:117] "RemoveContainer" containerID="c9d77cc51870e479df0ac073de335e1c640abf9e4f1ca59f987fad57c6168349" Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.245416 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66c766bfdf-qf7h6"] Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.314729 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.410030 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-logs\") pod \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\" (UID: \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.410143 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrcxx\" (UniqueName: \"kubernetes.io/projected/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-kube-api-access-wrcxx\") pod \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\" (UID: \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.410196 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-config-data\") pod \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\" (UID: \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.410234 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-nova-metadata-tls-certs\") pod \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\" (UID: \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.410313 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-combined-ca-bundle\") pod \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\" (UID: \"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36\") " Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.411613 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-logs" (OuterVolumeSpecName: "logs") pod "a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36" (UID: "a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.416077 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-kube-api-access-wrcxx" (OuterVolumeSpecName: "kube-api-access-wrcxx") pod "a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36" (UID: "a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36"). InnerVolumeSpecName "kube-api-access-wrcxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.439919 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-config-data" (OuterVolumeSpecName: "config-data") pod "a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36" (UID: "a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.456424 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36" (UID: "a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.462828 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36" (UID: "a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.512280 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.512310 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-logs\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.512322 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrcxx\" (UniqueName: \"kubernetes.io/projected/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-kube-api-access-wrcxx\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.512361 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:16 crc kubenswrapper[4845]: I1006 07:03:16.512414 4845 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.170600 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36","Type":"ContainerDied","Data":"b31c6e019be117146ec796b35018bc4a25e30ec8b3db06b71f4d2178e854fa87"} Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.172055 4845 scope.go:117] "RemoveContainer" containerID="b9734cbd7007b7dc6840238688fe114f567cc660cb09a03d34a60686f010ebca" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.170688 4845 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.174562 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3beeba2c-745b-472d-9a8b-152aed3c246b" containerName="nova-scheduler-scheduler" containerID="cri-o://01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57" gracePeriod=30 Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.174796 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"db6ea194-3e38-44a9-9ac4-0182b588cee2","Type":"ContainerStarted","Data":"9517ae7ea2b788010f2103e36b7b0a073f4734a026fe6ec77de5acfbd6281408"} Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.174952 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.200653 4845 scope.go:117] "RemoveContainer" containerID="21e9738c1c6b8688cb3943f0fefdbf33913748e44102a5ddee9c00a74e880eba" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.205216 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.20518885 podStartE2EDuration="2.20518885s" podCreationTimestamp="2025-10-06 07:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:03:17.197263781 +0000 UTC m=+1081.712004789" watchObservedRunningTime="2025-10-06 07:03:17.20518885 +0000 UTC m=+1081.719929858" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.222066 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.238719 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 
07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.247422 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 07:03:17 crc kubenswrapper[4845]: E1006 07:03:17.248224 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="967c80f8-047c-4c0a-a81f-25b6741caf0a" containerName="init" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.250699 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="967c80f8-047c-4c0a-a81f-25b6741caf0a" containerName="init" Oct 06 07:03:17 crc kubenswrapper[4845]: E1006 07:03:17.250914 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36" containerName="nova-metadata-log" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.251005 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36" containerName="nova-metadata-log" Oct 06 07:03:17 crc kubenswrapper[4845]: E1006 07:03:17.251103 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36" containerName="nova-metadata-metadata" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.251251 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36" containerName="nova-metadata-metadata" Oct 06 07:03:17 crc kubenswrapper[4845]: E1006 07:03:17.251385 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="967c80f8-047c-4c0a-a81f-25b6741caf0a" containerName="dnsmasq-dns" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.251484 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="967c80f8-047c-4c0a-a81f-25b6741caf0a" containerName="dnsmasq-dns" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.252150 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36" containerName="nova-metadata-log" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.252247 4845 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36" containerName="nova-metadata-metadata" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.252332 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="967c80f8-047c-4c0a-a81f-25b6741caf0a" containerName="dnsmasq-dns" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.254221 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.260522 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.260743 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.261594 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.325367 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14928035-28fc-46c2-ade7-cc9f24cd0660-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"14928035-28fc-46c2-ade7-cc9f24cd0660\") " pod="openstack/nova-metadata-0" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.325460 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14928035-28fc-46c2-ade7-cc9f24cd0660-logs\") pod \"nova-metadata-0\" (UID: \"14928035-28fc-46c2-ade7-cc9f24cd0660\") " pod="openstack/nova-metadata-0" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.325543 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/14928035-28fc-46c2-ade7-cc9f24cd0660-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"14928035-28fc-46c2-ade7-cc9f24cd0660\") " pod="openstack/nova-metadata-0" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.325567 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rck7h\" (UniqueName: \"kubernetes.io/projected/14928035-28fc-46c2-ade7-cc9f24cd0660-kube-api-access-rck7h\") pod \"nova-metadata-0\" (UID: \"14928035-28fc-46c2-ade7-cc9f24cd0660\") " pod="openstack/nova-metadata-0" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.325639 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14928035-28fc-46c2-ade7-cc9f24cd0660-config-data\") pod \"nova-metadata-0\" (UID: \"14928035-28fc-46c2-ade7-cc9f24cd0660\") " pod="openstack/nova-metadata-0" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.427569 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/14928035-28fc-46c2-ade7-cc9f24cd0660-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"14928035-28fc-46c2-ade7-cc9f24cd0660\") " pod="openstack/nova-metadata-0" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.427641 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rck7h\" (UniqueName: \"kubernetes.io/projected/14928035-28fc-46c2-ade7-cc9f24cd0660-kube-api-access-rck7h\") pod \"nova-metadata-0\" (UID: \"14928035-28fc-46c2-ade7-cc9f24cd0660\") " pod="openstack/nova-metadata-0" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.427720 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14928035-28fc-46c2-ade7-cc9f24cd0660-config-data\") pod \"nova-metadata-0\" (UID: 
\"14928035-28fc-46c2-ade7-cc9f24cd0660\") " pod="openstack/nova-metadata-0" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.427748 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14928035-28fc-46c2-ade7-cc9f24cd0660-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"14928035-28fc-46c2-ade7-cc9f24cd0660\") " pod="openstack/nova-metadata-0" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.427817 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14928035-28fc-46c2-ade7-cc9f24cd0660-logs\") pod \"nova-metadata-0\" (UID: \"14928035-28fc-46c2-ade7-cc9f24cd0660\") " pod="openstack/nova-metadata-0" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.428809 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14928035-28fc-46c2-ade7-cc9f24cd0660-logs\") pod \"nova-metadata-0\" (UID: \"14928035-28fc-46c2-ade7-cc9f24cd0660\") " pod="openstack/nova-metadata-0" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.443793 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14928035-28fc-46c2-ade7-cc9f24cd0660-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"14928035-28fc-46c2-ade7-cc9f24cd0660\") " pod="openstack/nova-metadata-0" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.445997 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rck7h\" (UniqueName: \"kubernetes.io/projected/14928035-28fc-46c2-ade7-cc9f24cd0660-kube-api-access-rck7h\") pod \"nova-metadata-0\" (UID: \"14928035-28fc-46c2-ade7-cc9f24cd0660\") " pod="openstack/nova-metadata-0" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.448082 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/14928035-28fc-46c2-ade7-cc9f24cd0660-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"14928035-28fc-46c2-ade7-cc9f24cd0660\") " pod="openstack/nova-metadata-0" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.451495 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14928035-28fc-46c2-ade7-cc9f24cd0660-config-data\") pod \"nova-metadata-0\" (UID: \"14928035-28fc-46c2-ade7-cc9f24cd0660\") " pod="openstack/nova-metadata-0" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.571687 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 07:03:17 crc kubenswrapper[4845]: I1006 07:03:17.988896 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 07:03:18 crc kubenswrapper[4845]: I1006 07:03:18.186704 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14928035-28fc-46c2-ade7-cc9f24cd0660","Type":"ContainerStarted","Data":"4c26cfc3687dbbd3ab44bb03c190efdd3b7dead4682e0a5673e98c0868e3d64f"} Oct 06 07:03:18 crc kubenswrapper[4845]: I1006 07:03:18.186748 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14928035-28fc-46c2-ade7-cc9f24cd0660","Type":"ContainerStarted","Data":"d7a7fffe9f51e0854895fc8cce331c1a94f9ebd90d5c4e16c3c9dd11e10d2667"} Oct 06 07:03:18 crc kubenswrapper[4845]: I1006 07:03:18.242202 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="967c80f8-047c-4c0a-a81f-25b6741caf0a" path="/var/lib/kubelet/pods/967c80f8-047c-4c0a-a81f-25b6741caf0a/volumes" Oct 06 07:03:18 crc kubenswrapper[4845]: I1006 07:03:18.243333 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36" 
path="/var/lib/kubelet/pods/a4e37ede-22f6-42b4-8f7d-8c6cafc9fe36/volumes" Oct 06 07:03:19 crc kubenswrapper[4845]: I1006 07:03:19.195471 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14928035-28fc-46c2-ade7-cc9f24cd0660","Type":"ContainerStarted","Data":"8dbb13cbaf28c2d7893574825aedb1a3d0f6564ea6a9e08ae8287203c21a11e4"} Oct 06 07:03:19 crc kubenswrapper[4845]: I1006 07:03:19.218158 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.218143116 podStartE2EDuration="2.218143116s" podCreationTimestamp="2025-10-06 07:03:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:03:19.214712076 +0000 UTC m=+1083.729453084" watchObservedRunningTime="2025-10-06 07:03:19.218143116 +0000 UTC m=+1083.732884124" Oct 06 07:03:19 crc kubenswrapper[4845]: E1006 07:03:19.648157 4845 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57 is running failed: container process not found" containerID="01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 07:03:19 crc kubenswrapper[4845]: E1006 07:03:19.648984 4845 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57 is running failed: container process not found" containerID="01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 07:03:19 crc kubenswrapper[4845]: E1006 07:03:19.650081 4845 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = 
NotFound desc = container is not created or running: checking if PID of 01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57 is running failed: container process not found" containerID="01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 07:03:19 crc kubenswrapper[4845]: E1006 07:03:19.650109 4845 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="3beeba2c-745b-472d-9a8b-152aed3c246b" containerName="nova-scheduler-scheduler" Oct 06 07:03:19 crc kubenswrapper[4845]: I1006 07:03:19.972062 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.082782 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3beeba2c-745b-472d-9a8b-152aed3c246b-combined-ca-bundle\") pod \"3beeba2c-745b-472d-9a8b-152aed3c246b\" (UID: \"3beeba2c-745b-472d-9a8b-152aed3c246b\") " Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.083164 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb6zz\" (UniqueName: \"kubernetes.io/projected/3beeba2c-745b-472d-9a8b-152aed3c246b-kube-api-access-bb6zz\") pod \"3beeba2c-745b-472d-9a8b-152aed3c246b\" (UID: \"3beeba2c-745b-472d-9a8b-152aed3c246b\") " Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.083209 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3beeba2c-745b-472d-9a8b-152aed3c246b-config-data\") pod \"3beeba2c-745b-472d-9a8b-152aed3c246b\" (UID: 
\"3beeba2c-745b-472d-9a8b-152aed3c246b\") " Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.091872 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3beeba2c-745b-472d-9a8b-152aed3c246b-kube-api-access-bb6zz" (OuterVolumeSpecName: "kube-api-access-bb6zz") pod "3beeba2c-745b-472d-9a8b-152aed3c246b" (UID: "3beeba2c-745b-472d-9a8b-152aed3c246b"). InnerVolumeSpecName "kube-api-access-bb6zz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.120576 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3beeba2c-745b-472d-9a8b-152aed3c246b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3beeba2c-745b-472d-9a8b-152aed3c246b" (UID: "3beeba2c-745b-472d-9a8b-152aed3c246b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.139458 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3beeba2c-745b-472d-9a8b-152aed3c246b-config-data" (OuterVolumeSpecName: "config-data") pod "3beeba2c-745b-472d-9a8b-152aed3c246b" (UID: "3beeba2c-745b-472d-9a8b-152aed3c246b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.185868 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3beeba2c-745b-472d-9a8b-152aed3c246b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.185909 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb6zz\" (UniqueName: \"kubernetes.io/projected/3beeba2c-745b-472d-9a8b-152aed3c246b-kube-api-access-bb6zz\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.185921 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3beeba2c-745b-472d-9a8b-152aed3c246b-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.209195 4845 generic.go:334] "Generic (PLEG): container finished" podID="3beeba2c-745b-472d-9a8b-152aed3c246b" containerID="01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57" exitCode=0 Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.210316 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.212362 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3beeba2c-745b-472d-9a8b-152aed3c246b","Type":"ContainerDied","Data":"01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57"} Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.212425 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3beeba2c-745b-472d-9a8b-152aed3c246b","Type":"ContainerDied","Data":"c102a419b11fff1a4880da278b4fb790116bc356d01b5927f36148ac4a2c937f"} Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.212442 4845 scope.go:117] "RemoveContainer" containerID="01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.246381 4845 scope.go:117] "RemoveContainer" containerID="01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57" Oct 06 07:03:20 crc kubenswrapper[4845]: E1006 07:03:20.246788 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57\": container with ID starting with 01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57 not found: ID does not exist" containerID="01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.246827 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57"} err="failed to get container status \"01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57\": rpc error: code = NotFound desc = could not find container \"01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57\": container with ID starting with 
01b7603ff3ad703634ade0b1a776b7036b175a58750c53263090228dfdac3e57 not found: ID does not exist" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.247038 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.269925 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.278536 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 07:03:20 crc kubenswrapper[4845]: E1006 07:03:20.278956 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3beeba2c-745b-472d-9a8b-152aed3c246b" containerName="nova-scheduler-scheduler" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.278971 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3beeba2c-745b-472d-9a8b-152aed3c246b" containerName="nova-scheduler-scheduler" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.279172 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="3beeba2c-745b-472d-9a8b-152aed3c246b" containerName="nova-scheduler-scheduler" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.279833 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.281930 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.297536 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.388908 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3445051-6b05-4a57-8766-4f8f066510e2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f3445051-6b05-4a57-8766-4f8f066510e2\") " pod="openstack/nova-scheduler-0" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.389009 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3445051-6b05-4a57-8766-4f8f066510e2-config-data\") pod \"nova-scheduler-0\" (UID: \"f3445051-6b05-4a57-8766-4f8f066510e2\") " pod="openstack/nova-scheduler-0" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.389314 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnglz\" (UniqueName: \"kubernetes.io/projected/f3445051-6b05-4a57-8766-4f8f066510e2-kube-api-access-cnglz\") pod \"nova-scheduler-0\" (UID: \"f3445051-6b05-4a57-8766-4f8f066510e2\") " pod="openstack/nova-scheduler-0" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.491306 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3445051-6b05-4a57-8766-4f8f066510e2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f3445051-6b05-4a57-8766-4f8f066510e2\") " pod="openstack/nova-scheduler-0" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.491361 4845 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3445051-6b05-4a57-8766-4f8f066510e2-config-data\") pod \"nova-scheduler-0\" (UID: \"f3445051-6b05-4a57-8766-4f8f066510e2\") " pod="openstack/nova-scheduler-0" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.491458 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnglz\" (UniqueName: \"kubernetes.io/projected/f3445051-6b05-4a57-8766-4f8f066510e2-kube-api-access-cnglz\") pod \"nova-scheduler-0\" (UID: \"f3445051-6b05-4a57-8766-4f8f066510e2\") " pod="openstack/nova-scheduler-0" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.496020 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3445051-6b05-4a57-8766-4f8f066510e2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f3445051-6b05-4a57-8766-4f8f066510e2\") " pod="openstack/nova-scheduler-0" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.496108 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3445051-6b05-4a57-8766-4f8f066510e2-config-data\") pod \"nova-scheduler-0\" (UID: \"f3445051-6b05-4a57-8766-4f8f066510e2\") " pod="openstack/nova-scheduler-0" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.505924 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnglz\" (UniqueName: \"kubernetes.io/projected/f3445051-6b05-4a57-8766-4f8f066510e2-kube-api-access-cnglz\") pod \"nova-scheduler-0\" (UID: \"f3445051-6b05-4a57-8766-4f8f066510e2\") " pod="openstack/nova-scheduler-0" Oct 06 07:03:20 crc kubenswrapper[4845]: I1006 07:03:20.597188 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.041286 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.177351 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.255280 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f3445051-6b05-4a57-8766-4f8f066510e2","Type":"ContainerStarted","Data":"a5d20692cd55a1bff257cfb2486a188fcf8c96475ff83039d7fe53877f9abfe6"} Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.263258 4845 generic.go:334] "Generic (PLEG): container finished" podID="6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7" containerID="9813e57cb127ae523b7b406e91d13797ab42e89504b659c127121590d8f107fb" exitCode=0 Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.263405 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7","Type":"ContainerDied","Data":"9813e57cb127ae523b7b406e91d13797ab42e89504b659c127121590d8f107fb"} Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.263437 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7","Type":"ContainerDied","Data":"05345383f77cb40f9f7423193b3ab97454365b2789ed3afaf516ca8e26f64714"} Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.263473 4845 scope.go:117] "RemoveContainer" containerID="9813e57cb127ae523b7b406e91d13797ab42e89504b659c127121590d8f107fb" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.263716 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.315096 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-combined-ca-bundle\") pod \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\" (UID: \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\") " Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.315347 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-config-data\") pod \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\" (UID: \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\") " Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.315391 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-logs\") pod \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\" (UID: \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\") " Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.315418 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbxc5\" (UniqueName: \"kubernetes.io/projected/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-kube-api-access-mbxc5\") pod \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\" (UID: \"6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7\") " Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.324893 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-logs" (OuterVolumeSpecName: "logs") pod "6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7" (UID: "6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.326709 4845 scope.go:117] "RemoveContainer" containerID="3d120a0b4fe13879b4b6f51e798f5b542bef46492cc58da5cdcd3c253b6db6db" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.341736 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-kube-api-access-mbxc5" (OuterVolumeSpecName: "kube-api-access-mbxc5") pod "6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7" (UID: "6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7"). InnerVolumeSpecName "kube-api-access-mbxc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.375251 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7" (UID: "6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.377815 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-config-data" (OuterVolumeSpecName: "config-data") pod "6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7" (UID: "6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.394661 4845 scope.go:117] "RemoveContainer" containerID="9813e57cb127ae523b7b406e91d13797ab42e89504b659c127121590d8f107fb" Oct 06 07:03:21 crc kubenswrapper[4845]: E1006 07:03:21.395091 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9813e57cb127ae523b7b406e91d13797ab42e89504b659c127121590d8f107fb\": container with ID starting with 9813e57cb127ae523b7b406e91d13797ab42e89504b659c127121590d8f107fb not found: ID does not exist" containerID="9813e57cb127ae523b7b406e91d13797ab42e89504b659c127121590d8f107fb" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.395145 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9813e57cb127ae523b7b406e91d13797ab42e89504b659c127121590d8f107fb"} err="failed to get container status \"9813e57cb127ae523b7b406e91d13797ab42e89504b659c127121590d8f107fb\": rpc error: code = NotFound desc = could not find container \"9813e57cb127ae523b7b406e91d13797ab42e89504b659c127121590d8f107fb\": container with ID starting with 9813e57cb127ae523b7b406e91d13797ab42e89504b659c127121590d8f107fb not found: ID does not exist" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.395200 4845 scope.go:117] "RemoveContainer" containerID="3d120a0b4fe13879b4b6f51e798f5b542bef46492cc58da5cdcd3c253b6db6db" Oct 06 07:03:21 crc kubenswrapper[4845]: E1006 07:03:21.395691 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d120a0b4fe13879b4b6f51e798f5b542bef46492cc58da5cdcd3c253b6db6db\": container with ID starting with 3d120a0b4fe13879b4b6f51e798f5b542bef46492cc58da5cdcd3c253b6db6db not found: ID does not exist" containerID="3d120a0b4fe13879b4b6f51e798f5b542bef46492cc58da5cdcd3c253b6db6db" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.395775 
4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d120a0b4fe13879b4b6f51e798f5b542bef46492cc58da5cdcd3c253b6db6db"} err="failed to get container status \"3d120a0b4fe13879b4b6f51e798f5b542bef46492cc58da5cdcd3c253b6db6db\": rpc error: code = NotFound desc = could not find container \"3d120a0b4fe13879b4b6f51e798f5b542bef46492cc58da5cdcd3c253b6db6db\": container with ID starting with 3d120a0b4fe13879b4b6f51e798f5b542bef46492cc58da5cdcd3c253b6db6db not found: ID does not exist" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.416932 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.416960 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-logs\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.416969 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbxc5\" (UniqueName: \"kubernetes.io/projected/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-kube-api-access-mbxc5\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.416980 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.611595 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.633283 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.639177 4845 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-api-0"] Oct 06 07:03:21 crc kubenswrapper[4845]: E1006 07:03:21.639998 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7" containerName="nova-api-api" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.640160 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7" containerName="nova-api-api" Oct 06 07:03:21 crc kubenswrapper[4845]: E1006 07:03:21.640359 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7" containerName="nova-api-log" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.640525 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7" containerName="nova-api-log" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.640964 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7" containerName="nova-api-log" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.641119 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7" containerName="nova-api-api" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.643010 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.646524 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.646700 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.724053 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-config-data\") pod \"nova-api-0\" (UID: \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\") " pod="openstack/nova-api-0" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.724296 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc2qm\" (UniqueName: \"kubernetes.io/projected/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-kube-api-access-rc2qm\") pod \"nova-api-0\" (UID: \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\") " pod="openstack/nova-api-0" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.724715 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-logs\") pod \"nova-api-0\" (UID: \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\") " pod="openstack/nova-api-0" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.724858 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\") " pod="openstack/nova-api-0" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.826625 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-logs\") pod \"nova-api-0\" (UID: \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\") " pod="openstack/nova-api-0" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.826686 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\") " pod="openstack/nova-api-0" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.826704 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-config-data\") pod \"nova-api-0\" (UID: \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\") " pod="openstack/nova-api-0" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.826785 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc2qm\" (UniqueName: \"kubernetes.io/projected/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-kube-api-access-rc2qm\") pod \"nova-api-0\" (UID: \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\") " pod="openstack/nova-api-0" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.827511 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-logs\") pod \"nova-api-0\" (UID: \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\") " pod="openstack/nova-api-0" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.831678 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\") " pod="openstack/nova-api-0" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.834807 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-config-data\") pod \"nova-api-0\" (UID: \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\") " pod="openstack/nova-api-0" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.842759 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc2qm\" (UniqueName: \"kubernetes.io/projected/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-kube-api-access-rc2qm\") pod \"nova-api-0\" (UID: \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\") " pod="openstack/nova-api-0" Oct 06 07:03:21 crc kubenswrapper[4845]: I1006 07:03:21.982590 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 07:03:22 crc kubenswrapper[4845]: I1006 07:03:22.188008 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 06 07:03:22 crc kubenswrapper[4845]: I1006 07:03:22.240362 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3beeba2c-745b-472d-9a8b-152aed3c246b" path="/var/lib/kubelet/pods/3beeba2c-745b-472d-9a8b-152aed3c246b/volumes" Oct 06 07:03:22 crc kubenswrapper[4845]: I1006 07:03:22.241481 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7" path="/var/lib/kubelet/pods/6c5ced93-ff6a-4684-8cf7-6cdfa29c7ca7/volumes" Oct 06 07:03:22 crc kubenswrapper[4845]: I1006 07:03:22.301609 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f3445051-6b05-4a57-8766-4f8f066510e2","Type":"ContainerStarted","Data":"581b89cd7b16b4e79c10ed8b5878e1d779a532ed6d0907ba99230c5504cf2dc5"} Oct 06 07:03:22 crc kubenswrapper[4845]: I1006 07:03:22.326293 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.326277283 podStartE2EDuration="2.326277283s" 
podCreationTimestamp="2025-10-06 07:03:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:03:22.318652932 +0000 UTC m=+1086.833393960" watchObservedRunningTime="2025-10-06 07:03:22.326277283 +0000 UTC m=+1086.841018291" Oct 06 07:03:22 crc kubenswrapper[4845]: I1006 07:03:22.415065 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 07:03:22 crc kubenswrapper[4845]: I1006 07:03:22.572266 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 07:03:22 crc kubenswrapper[4845]: I1006 07:03:22.572320 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 07:03:23 crc kubenswrapper[4845]: I1006 07:03:23.019140 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 07:03:23 crc kubenswrapper[4845]: I1006 07:03:23.019566 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 07:03:23 crc kubenswrapper[4845]: I1006 07:03:23.313254 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b4b64cd-a047-4723-9dfd-af1dc10bbecf","Type":"ContainerStarted","Data":"d1984a414f467e2914873ea0233ecfb3482b285b7941ceb3af91b389dd5e55fc"} Oct 06 07:03:23 crc kubenswrapper[4845]: I1006 07:03:23.313293 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"2b4b64cd-a047-4723-9dfd-af1dc10bbecf","Type":"ContainerStarted","Data":"d986056f321eb8b3db6603aeae21d99a4b92c36a7d8f260ba4f0d4b8676b1aa6"} Oct 06 07:03:23 crc kubenswrapper[4845]: I1006 07:03:23.313303 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b4b64cd-a047-4723-9dfd-af1dc10bbecf","Type":"ContainerStarted","Data":"825e4781de952a275be3b3d688dcdd38ed7d8687eff7c349e09c7dde7a65de39"} Oct 06 07:03:23 crc kubenswrapper[4845]: I1006 07:03:23.346839 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.346817536 podStartE2EDuration="2.346817536s" podCreationTimestamp="2025-10-06 07:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:03:23.332388245 +0000 UTC m=+1087.847129263" watchObservedRunningTime="2025-10-06 07:03:23.346817536 +0000 UTC m=+1087.861558554" Oct 06 07:03:25 crc kubenswrapper[4845]: I1006 07:03:25.559867 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 06 07:03:25 crc kubenswrapper[4845]: I1006 07:03:25.583086 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 07:03:25 crc kubenswrapper[4845]: I1006 07:03:25.583415 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="13f70ac7-bc2e-4cfe-a094-f78ec31b3879" containerName="kube-state-metrics" containerID="cri-o://82e094cc169c03e8a608336ecaee70b6d78617ca0f2f6d11927320ffbf09c7a4" gracePeriod=30 Oct 06 07:03:25 crc kubenswrapper[4845]: I1006 07:03:25.597830 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.087132 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.215691 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjhvp\" (UniqueName: \"kubernetes.io/projected/13f70ac7-bc2e-4cfe-a094-f78ec31b3879-kube-api-access-cjhvp\") pod \"13f70ac7-bc2e-4cfe-a094-f78ec31b3879\" (UID: \"13f70ac7-bc2e-4cfe-a094-f78ec31b3879\") " Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.220794 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f70ac7-bc2e-4cfe-a094-f78ec31b3879-kube-api-access-cjhvp" (OuterVolumeSpecName: "kube-api-access-cjhvp") pod "13f70ac7-bc2e-4cfe-a094-f78ec31b3879" (UID: "13f70ac7-bc2e-4cfe-a094-f78ec31b3879"). InnerVolumeSpecName "kube-api-access-cjhvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.318495 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjhvp\" (UniqueName: \"kubernetes.io/projected/13f70ac7-bc2e-4cfe-a094-f78ec31b3879-kube-api-access-cjhvp\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.340840 4845 generic.go:334] "Generic (PLEG): container finished" podID="13f70ac7-bc2e-4cfe-a094-f78ec31b3879" containerID="82e094cc169c03e8a608336ecaee70b6d78617ca0f2f6d11927320ffbf09c7a4" exitCode=2 Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.340891 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"13f70ac7-bc2e-4cfe-a094-f78ec31b3879","Type":"ContainerDied","Data":"82e094cc169c03e8a608336ecaee70b6d78617ca0f2f6d11927320ffbf09c7a4"} Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.340900 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.340929 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"13f70ac7-bc2e-4cfe-a094-f78ec31b3879","Type":"ContainerDied","Data":"978d205a1677d0f0110cc95cdd27ecb90389f2c95b3b36bc29c7ee371e49efd0"} Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.340962 4845 scope.go:117] "RemoveContainer" containerID="82e094cc169c03e8a608336ecaee70b6d78617ca0f2f6d11927320ffbf09c7a4" Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.362582 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.363830 4845 scope.go:117] "RemoveContainer" containerID="82e094cc169c03e8a608336ecaee70b6d78617ca0f2f6d11927320ffbf09c7a4" Oct 06 07:03:26 crc kubenswrapper[4845]: E1006 07:03:26.364193 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82e094cc169c03e8a608336ecaee70b6d78617ca0f2f6d11927320ffbf09c7a4\": container with ID starting with 82e094cc169c03e8a608336ecaee70b6d78617ca0f2f6d11927320ffbf09c7a4 not found: ID does not exist" containerID="82e094cc169c03e8a608336ecaee70b6d78617ca0f2f6d11927320ffbf09c7a4" Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.364235 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82e094cc169c03e8a608336ecaee70b6d78617ca0f2f6d11927320ffbf09c7a4"} err="failed to get container status \"82e094cc169c03e8a608336ecaee70b6d78617ca0f2f6d11927320ffbf09c7a4\": rpc error: code = NotFound desc = could not find container \"82e094cc169c03e8a608336ecaee70b6d78617ca0f2f6d11927320ffbf09c7a4\": container with ID starting with 82e094cc169c03e8a608336ecaee70b6d78617ca0f2f6d11927320ffbf09c7a4 not found: ID does not exist" Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 
07:03:26.372182 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.385833 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 07:03:26 crc kubenswrapper[4845]: E1006 07:03:26.386362 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13f70ac7-bc2e-4cfe-a094-f78ec31b3879" containerName="kube-state-metrics" Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.386399 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="13f70ac7-bc2e-4cfe-a094-f78ec31b3879" containerName="kube-state-metrics" Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.386664 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="13f70ac7-bc2e-4cfe-a094-f78ec31b3879" containerName="kube-state-metrics" Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.387528 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.390049 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.390125 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.403188 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.522059 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0dfda52e-f351-49b0-93b6-e95ce8146051-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0dfda52e-f351-49b0-93b6-e95ce8146051\") " pod="openstack/kube-state-metrics-0" Oct 06 07:03:26 crc 
kubenswrapper[4845]: I1006 07:03:26.522312 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh8r8\" (UniqueName: \"kubernetes.io/projected/0dfda52e-f351-49b0-93b6-e95ce8146051-kube-api-access-sh8r8\") pod \"kube-state-metrics-0\" (UID: \"0dfda52e-f351-49b0-93b6-e95ce8146051\") " pod="openstack/kube-state-metrics-0"
Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.522618 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0dfda52e-f351-49b0-93b6-e95ce8146051-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0dfda52e-f351-49b0-93b6-e95ce8146051\") " pod="openstack/kube-state-metrics-0"
Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.522674 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dfda52e-f351-49b0-93b6-e95ce8146051-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0dfda52e-f351-49b0-93b6-e95ce8146051\") " pod="openstack/kube-state-metrics-0"
Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.625052 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0dfda52e-f351-49b0-93b6-e95ce8146051-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0dfda52e-f351-49b0-93b6-e95ce8146051\") " pod="openstack/kube-state-metrics-0"
Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.625091 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dfda52e-f351-49b0-93b6-e95ce8146051-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0dfda52e-f351-49b0-93b6-e95ce8146051\") " pod="openstack/kube-state-metrics-0"
Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.625171 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0dfda52e-f351-49b0-93b6-e95ce8146051-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0dfda52e-f351-49b0-93b6-e95ce8146051\") " pod="openstack/kube-state-metrics-0"
Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.625218 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh8r8\" (UniqueName: \"kubernetes.io/projected/0dfda52e-f351-49b0-93b6-e95ce8146051-kube-api-access-sh8r8\") pod \"kube-state-metrics-0\" (UID: \"0dfda52e-f351-49b0-93b6-e95ce8146051\") " pod="openstack/kube-state-metrics-0"
Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.628756 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0dfda52e-f351-49b0-93b6-e95ce8146051-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0dfda52e-f351-49b0-93b6-e95ce8146051\") " pod="openstack/kube-state-metrics-0"
Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.628826 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0dfda52e-f351-49b0-93b6-e95ce8146051-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0dfda52e-f351-49b0-93b6-e95ce8146051\") " pod="openstack/kube-state-metrics-0"
Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.628951 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dfda52e-f351-49b0-93b6-e95ce8146051-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0dfda52e-f351-49b0-93b6-e95ce8146051\") " pod="openstack/kube-state-metrics-0"
Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.650574 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh8r8\" (UniqueName: \"kubernetes.io/projected/0dfda52e-f351-49b0-93b6-e95ce8146051-kube-api-access-sh8r8\") pod \"kube-state-metrics-0\" (UID: \"0dfda52e-f351-49b0-93b6-e95ce8146051\") " pod="openstack/kube-state-metrics-0"
Oct 06 07:03:26 crc kubenswrapper[4845]: I1006 07:03:26.711958 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 06 07:03:27 crc kubenswrapper[4845]: I1006 07:03:27.128223 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 06 07:03:27 crc kubenswrapper[4845]: I1006 07:03:27.155753 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:03:27 crc kubenswrapper[4845]: I1006 07:03:27.156022 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="297dd975-5214-490d-a596-42722d59c5a3" containerName="ceilometer-central-agent" containerID="cri-o://63c80eddb23dbda3d5ed0d26250eb5ce6f1eddbc0aec31719d0d651951454aeb" gracePeriod=30
Oct 06 07:03:27 crc kubenswrapper[4845]: I1006 07:03:27.156127 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="297dd975-5214-490d-a596-42722d59c5a3" containerName="proxy-httpd" containerID="cri-o://7578f2c015dc605af12885d3129d99a7d626c0827944e79bff977c3fb32fd2e9" gracePeriod=30
Oct 06 07:03:27 crc kubenswrapper[4845]: I1006 07:03:27.156174 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="297dd975-5214-490d-a596-42722d59c5a3" containerName="ceilometer-notification-agent" containerID="cri-o://e272e8a9a1ce4260b16abab94201745db8528ed805fa227d0441aed4998bcfdd" gracePeriod=30
Oct 06 07:03:27 crc kubenswrapper[4845]: I1006 07:03:27.156140 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="297dd975-5214-490d-a596-42722d59c5a3" containerName="sg-core" containerID="cri-o://6ed8a6cf4c9f639d31c13004eb3e0927b3b839f9d3c44c0beda079768726870a" gracePeriod=30
Oct 06 07:03:27 crc kubenswrapper[4845]: I1006 07:03:27.350337 4845 generic.go:334] "Generic (PLEG): container finished" podID="297dd975-5214-490d-a596-42722d59c5a3" containerID="6ed8a6cf4c9f639d31c13004eb3e0927b3b839f9d3c44c0beda079768726870a" exitCode=2
Oct 06 07:03:27 crc kubenswrapper[4845]: I1006 07:03:27.350481 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"297dd975-5214-490d-a596-42722d59c5a3","Type":"ContainerDied","Data":"6ed8a6cf4c9f639d31c13004eb3e0927b3b839f9d3c44c0beda079768726870a"}
Oct 06 07:03:27 crc kubenswrapper[4845]: I1006 07:03:27.352318 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0dfda52e-f351-49b0-93b6-e95ce8146051","Type":"ContainerStarted","Data":"159488c12b89a12ee6f834bf2dafe9a3393dcfc715c031ac44a957411624bb73"}
Oct 06 07:03:27 crc kubenswrapper[4845]: I1006 07:03:27.571949 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 06 07:03:27 crc kubenswrapper[4845]: I1006 07:03:27.572017 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 06 07:03:28 crc kubenswrapper[4845]: I1006 07:03:28.247204 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13f70ac7-bc2e-4cfe-a094-f78ec31b3879" path="/var/lib/kubelet/pods/13f70ac7-bc2e-4cfe-a094-f78ec31b3879/volumes"
Oct 06 07:03:28 crc kubenswrapper[4845]: I1006 07:03:28.362198 4845 generic.go:334] "Generic (PLEG): container finished" podID="297dd975-5214-490d-a596-42722d59c5a3" containerID="7578f2c015dc605af12885d3129d99a7d626c0827944e79bff977c3fb32fd2e9" exitCode=0
Oct 06 07:03:28 crc kubenswrapper[4845]: I1006 07:03:28.362238 4845 generic.go:334] "Generic (PLEG): container finished" podID="297dd975-5214-490d-a596-42722d59c5a3" containerID="63c80eddb23dbda3d5ed0d26250eb5ce6f1eddbc0aec31719d0d651951454aeb" exitCode=0
Oct 06 07:03:28 crc kubenswrapper[4845]: I1006 07:03:28.362260 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"297dd975-5214-490d-a596-42722d59c5a3","Type":"ContainerDied","Data":"7578f2c015dc605af12885d3129d99a7d626c0827944e79bff977c3fb32fd2e9"}
Oct 06 07:03:28 crc kubenswrapper[4845]: I1006 07:03:28.362296 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"297dd975-5214-490d-a596-42722d59c5a3","Type":"ContainerDied","Data":"63c80eddb23dbda3d5ed0d26250eb5ce6f1eddbc0aec31719d0d651951454aeb"}
Oct 06 07:03:28 crc kubenswrapper[4845]: I1006 07:03:28.363416 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0dfda52e-f351-49b0-93b6-e95ce8146051","Type":"ContainerStarted","Data":"a3e2414237c494104031ca081a42c18a4228dcf92041e959291f2632f54655dd"}
Oct 06 07:03:28 crc kubenswrapper[4845]: I1006 07:03:28.363574 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Oct 06 07:03:28 crc kubenswrapper[4845]: I1006 07:03:28.435267 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.048367043 podStartE2EDuration="2.435251646s" podCreationTimestamp="2025-10-06 07:03:26 +0000 UTC" firstStartedPulling="2025-10-06 07:03:27.138267142 +0000 UTC m=+1091.653008150" lastFinishedPulling="2025-10-06 07:03:27.525151745 +0000 UTC m=+1092.039892753" observedRunningTime="2025-10-06 07:03:28.432892874 +0000 UTC m=+1092.947633882" watchObservedRunningTime="2025-10-06 07:03:28.435251646 +0000 UTC m=+1092.949992654"
Oct 06 07:03:28 crc kubenswrapper[4845]: I1006 07:03:28.593569 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="14928035-28fc-46c2-ade7-cc9f24cd0660" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 06 07:03:28 crc kubenswrapper[4845]: I1006 07:03:28.593569 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="14928035-28fc-46c2-ade7-cc9f24cd0660" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.390297 4845 generic.go:334] "Generic (PLEG): container finished" podID="297dd975-5214-490d-a596-42722d59c5a3" containerID="e272e8a9a1ce4260b16abab94201745db8528ed805fa227d0441aed4998bcfdd" exitCode=0
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.390366 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"297dd975-5214-490d-a596-42722d59c5a3","Type":"ContainerDied","Data":"e272e8a9a1ce4260b16abab94201745db8528ed805fa227d0441aed4998bcfdd"}
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.390717 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"297dd975-5214-490d-a596-42722d59c5a3","Type":"ContainerDied","Data":"17b70b14dd204739d80a45e21465e7ac35c17c6036e8460958084c69aeb183b1"}
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.390746 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17b70b14dd204739d80a45e21465e7ac35c17c6036e8460958084c69aeb183b1"
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.458571 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.585527 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-scripts\") pod \"297dd975-5214-490d-a596-42722d59c5a3\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") "
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.585648 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-combined-ca-bundle\") pod \"297dd975-5214-490d-a596-42722d59c5a3\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") "
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.585722 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-sg-core-conf-yaml\") pod \"297dd975-5214-490d-a596-42722d59c5a3\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") "
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.585754 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-config-data\") pod \"297dd975-5214-490d-a596-42722d59c5a3\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") "
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.585782 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/297dd975-5214-490d-a596-42722d59c5a3-run-httpd\") pod \"297dd975-5214-490d-a596-42722d59c5a3\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") "
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.585881 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/297dd975-5214-490d-a596-42722d59c5a3-log-httpd\") pod \"297dd975-5214-490d-a596-42722d59c5a3\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") "
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.585923 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbxpt\" (UniqueName: \"kubernetes.io/projected/297dd975-5214-490d-a596-42722d59c5a3-kube-api-access-cbxpt\") pod \"297dd975-5214-490d-a596-42722d59c5a3\" (UID: \"297dd975-5214-490d-a596-42722d59c5a3\") "
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.586317 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/297dd975-5214-490d-a596-42722d59c5a3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "297dd975-5214-490d-a596-42722d59c5a3" (UID: "297dd975-5214-490d-a596-42722d59c5a3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.586731 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/297dd975-5214-490d-a596-42722d59c5a3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "297dd975-5214-490d-a596-42722d59c5a3" (UID: "297dd975-5214-490d-a596-42722d59c5a3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.592596 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/297dd975-5214-490d-a596-42722d59c5a3-kube-api-access-cbxpt" (OuterVolumeSpecName: "kube-api-access-cbxpt") pod "297dd975-5214-490d-a596-42722d59c5a3" (UID: "297dd975-5214-490d-a596-42722d59c5a3"). InnerVolumeSpecName "kube-api-access-cbxpt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.594518 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-scripts" (OuterVolumeSpecName: "scripts") pod "297dd975-5214-490d-a596-42722d59c5a3" (UID: "297dd975-5214-490d-a596-42722d59c5a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.614549 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "297dd975-5214-490d-a596-42722d59c5a3" (UID: "297dd975-5214-490d-a596-42722d59c5a3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.659137 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "297dd975-5214-490d-a596-42722d59c5a3" (UID: "297dd975-5214-490d-a596-42722d59c5a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.679927 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-config-data" (OuterVolumeSpecName: "config-data") pod "297dd975-5214-490d-a596-42722d59c5a3" (UID: "297dd975-5214-490d-a596-42722d59c5a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.688072 4845 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.688098 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.688107 4845 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/297dd975-5214-490d-a596-42722d59c5a3-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.688115 4845 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/297dd975-5214-490d-a596-42722d59c5a3-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.688123 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbxpt\" (UniqueName: \"kubernetes.io/projected/297dd975-5214-490d-a596-42722d59c5a3-kube-api-access-cbxpt\") on node \"crc\" DevicePath \"\""
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.688133 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 07:03:29 crc kubenswrapper[4845]: I1006 07:03:29.688141 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/297dd975-5214-490d-a596-42722d59c5a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.397089 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.419991 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.428021 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.444799 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:03:30 crc kubenswrapper[4845]: E1006 07:03:30.445285 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="297dd975-5214-490d-a596-42722d59c5a3" containerName="ceilometer-notification-agent"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.445311 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="297dd975-5214-490d-a596-42722d59c5a3" containerName="ceilometer-notification-agent"
Oct 06 07:03:30 crc kubenswrapper[4845]: E1006 07:03:30.445340 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="297dd975-5214-490d-a596-42722d59c5a3" containerName="proxy-httpd"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.445349 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="297dd975-5214-490d-a596-42722d59c5a3" containerName="proxy-httpd"
Oct 06 07:03:30 crc kubenswrapper[4845]: E1006 07:03:30.445415 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="297dd975-5214-490d-a596-42722d59c5a3" containerName="sg-core"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.445429 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="297dd975-5214-490d-a596-42722d59c5a3" containerName="sg-core"
Oct 06 07:03:30 crc kubenswrapper[4845]: E1006 07:03:30.445448 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="297dd975-5214-490d-a596-42722d59c5a3" containerName="ceilometer-central-agent"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.445457 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="297dd975-5214-490d-a596-42722d59c5a3" containerName="ceilometer-central-agent"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.445680 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="297dd975-5214-490d-a596-42722d59c5a3" containerName="proxy-httpd"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.445700 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="297dd975-5214-490d-a596-42722d59c5a3" containerName="ceilometer-central-agent"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.445725 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="297dd975-5214-490d-a596-42722d59c5a3" containerName="sg-core"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.445735 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="297dd975-5214-490d-a596-42722d59c5a3" containerName="ceilometer-notification-agent"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.447929 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.450268 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.450503 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.450875 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.454694 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.502130 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-config-data\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.502182 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.502267 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.502292 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkdrl\" (UniqueName: \"kubernetes.io/projected/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-kube-api-access-kkdrl\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.502456 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-scripts\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.502530 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-log-httpd\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.502574 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.502710 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-run-httpd\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.598321 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.603964 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.604007 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkdrl\" (UniqueName: \"kubernetes.io/projected/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-kube-api-access-kkdrl\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.604098 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-scripts\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.604125 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-log-httpd\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.604145 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.604202 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-run-httpd\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.604244 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-config-data\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.604267 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.605160 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-log-httpd\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.605195 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-run-httpd\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.611010 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.611112 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-scripts\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.611211 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-config-data\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.612600 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.615769 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.621931 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkdrl\" (UniqueName: \"kubernetes.io/projected/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-kube-api-access-kkdrl\") pod \"ceilometer-0\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " pod="openstack/ceilometer-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.626554 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 06 07:03:30 crc kubenswrapper[4845]: I1006 07:03:30.775337 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 07:03:31 crc kubenswrapper[4845]: I1006 07:03:31.224951 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:03:31 crc kubenswrapper[4845]: I1006 07:03:31.406545 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2","Type":"ContainerStarted","Data":"51c1f9d0284b90dd497cba3a9807962d595ad67fb2168dc0f4253f7de7b7a87b"}
Oct 06 07:03:31 crc kubenswrapper[4845]: I1006 07:03:31.447477 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 06 07:03:31 crc kubenswrapper[4845]: I1006 07:03:31.985154 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 06 07:03:31 crc kubenswrapper[4845]: I1006 07:03:31.985526 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 06 07:03:32 crc kubenswrapper[4845]: I1006 07:03:32.240757 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="297dd975-5214-490d-a596-42722d59c5a3" path="/var/lib/kubelet/pods/297dd975-5214-490d-a596-42722d59c5a3/volumes"
Oct 06 07:03:32 crc kubenswrapper[4845]: I1006 07:03:32.417866 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2","Type":"ContainerStarted","Data":"69b52cc998ba310ccb996e370009c0b07f90bb923815e6afa911912728d61d61"}
Oct 06 07:03:32 crc kubenswrapper[4845]: I1006 07:03:32.417905 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2","Type":"ContainerStarted","Data":"d25e9fd0766b138795e127a48526d59a6c74f03dde150270047df9acec9bc025"}
Oct 06 07:03:33 crc kubenswrapper[4845]: I1006 07:03:33.067639 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2b4b64cd-a047-4723-9dfd-af1dc10bbecf" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 06 07:03:33 crc kubenswrapper[4845]: I1006 07:03:33.067659 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2b4b64cd-a047-4723-9dfd-af1dc10bbecf" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 06 07:03:33 crc kubenswrapper[4845]: I1006 07:03:33.428931 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2","Type":"ContainerStarted","Data":"a406e9f3d92f9a261f69a9f54dcf9612cd05e7804dbc6505c0a4115d7c02c34a"}
Oct 06 07:03:34 crc kubenswrapper[4845]: I1006 07:03:34.440235 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2","Type":"ContainerStarted","Data":"6972853ad13d6b588ccb72cd1c546b3f8b3c82d5793ea9c3418920ee8ebe5c7d"}
Oct 06 07:03:34 crc kubenswrapper[4845]: I1006 07:03:34.440960 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 06 07:03:34 crc kubenswrapper[4845]: I1006 07:03:34.464643 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.57287181 podStartE2EDuration="4.464627401s" podCreationTimestamp="2025-10-06 07:03:30 +0000 UTC" firstStartedPulling="2025-10-06 07:03:31.228967871 +0000 UTC m=+1095.743708889" lastFinishedPulling="2025-10-06 07:03:34.120723462 +0000 UTC m=+1098.635464480" observedRunningTime="2025-10-06 07:03:34.46116182 +0000 UTC m=+1098.975902848" watchObservedRunningTime="2025-10-06 07:03:34.464627401 +0000 UTC m=+1098.979368409"
Oct 06 07:03:36 crc kubenswrapper[4845]: I1006 07:03:36.738127 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Oct 06 07:03:37 crc kubenswrapper[4845]: I1006 07:03:37.578905 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 06 07:03:37 crc kubenswrapper[4845]: I1006 07:03:37.581053 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 06 07:03:37 crc kubenswrapper[4845]: I1006 07:03:37.583477 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 06 07:03:38 crc kubenswrapper[4845]: I1006 07:03:38.485875 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 06 07:03:39 crc kubenswrapper[4845]: E1006 07:03:39.067028 4845 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3beeba2c_745b_472d_9a8b_152aed3c246b.slice/crio-c102a419b11fff1a4880da278b4fb790116bc356d01b5927f36148ac4a2c937f: Error finding container c102a419b11fff1a4880da278b4fb790116bc356d01b5927f36148ac4a2c937f: Status 404 returned error can't find the container with id c102a419b11fff1a4880da278b4fb790116bc356d01b5927f36148ac4a2c937f
Oct 06 07:03:39 crc kubenswrapper[4845]: E1006 07:03:39.070352 4845 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4e37ede_22f6_42b4_8f7d_8c6cafc9fe36.slice/crio-b31c6e019be117146ec796b35018bc4a25e30ec8b3db06b71f4d2178e854fa87: Error finding container b31c6e019be117146ec796b35018bc4a25e30ec8b3db06b71f4d2178e854fa87: Status 404 returned error can't find the container with id b31c6e019be117146ec796b35018bc4a25e30ec8b3db06b71f4d2178e854fa87
Oct 06 07:03:39 crc kubenswrapper[4845]: E1006 07:03:39.356024 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial
failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod297dd975_5214_490d_a596_42722d59c5a3.slice/crio-conmon-7578f2c015dc605af12885d3129d99a7d626c0827944e79bff977c3fb32fd2e9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod297dd975_5214_490d_a596_42722d59c5a3.slice/crio-17b70b14dd204739d80a45e21465e7ac35c17c6036e8460958084c69aeb183b1\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod297dd975_5214_490d_a596_42722d59c5a3.slice/crio-conmon-6ed8a6cf4c9f639d31c13004eb3e0927b3b839f9d3c44c0beda079768726870a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod297dd975_5214_490d_a596_42722d59c5a3.slice/crio-7578f2c015dc605af12885d3129d99a7d626c0827944e79bff977c3fb32fd2e9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod297dd975_5214_490d_a596_42722d59c5a3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13f70ac7_bc2e_4cfe_a094_f78ec31b3879.slice/crio-978d205a1677d0f0110cc95cdd27ecb90389f2c95b3b36bc29c7ee371e49efd0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13f70ac7_bc2e_4cfe_a094_f78ec31b3879.slice/crio-82e094cc169c03e8a608336ecaee70b6d78617ca0f2f6d11927320ffbf09c7a4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod297dd975_5214_490d_a596_42722d59c5a3.slice/crio-6ed8a6cf4c9f639d31c13004eb3e0927b3b839f9d3c44c0beda079768726870a.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod297dd975_5214_490d_a596_42722d59c5a3.slice/crio-e272e8a9a1ce4260b16abab94201745db8528ed805fa227d0441aed4998bcfdd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d99c844_d63a_420b_a123_2f549734e048.slice/crio-132dea06ce0c292cbce3471aeb4e26b414844fc39413f9d43ce9ffce9a6057da.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod297dd975_5214_490d_a596_42722d59c5a3.slice/crio-conmon-e272e8a9a1ce4260b16abab94201745db8528ed805fa227d0441aed4998bcfdd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13f70ac7_bc2e_4cfe_a094_f78ec31b3879.slice/crio-conmon-82e094cc169c03e8a608336ecaee70b6d78617ca0f2f6d11927320ffbf09c7a4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13f70ac7_bc2e_4cfe_a094_f78ec31b3879.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod297dd975_5214_490d_a596_42722d59c5a3.slice/crio-63c80eddb23dbda3d5ed0d26250eb5ce6f1eddbc0aec31719d0d651951454aeb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d99c844_d63a_420b_a123_2f549734e048.slice/crio-conmon-132dea06ce0c292cbce3471aeb4e26b414844fc39413f9d43ce9ffce9a6057da.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod297dd975_5214_490d_a596_42722d59c5a3.slice/crio-conmon-63c80eddb23dbda3d5ed0d26250eb5ce6f1eddbc0aec31719d0d651951454aeb.scope\": RecentStats: unable to find data in memory cache]" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.442815 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.493353 4845 generic.go:334] "Generic (PLEG): container finished" podID="4d99c844-d63a-420b-a123-2f549734e048" containerID="132dea06ce0c292cbce3471aeb4e26b414844fc39413f9d43ce9ffce9a6057da" exitCode=137 Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.494452 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.495055 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4d99c844-d63a-420b-a123-2f549734e048","Type":"ContainerDied","Data":"132dea06ce0c292cbce3471aeb4e26b414844fc39413f9d43ce9ffce9a6057da"} Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.495089 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4d99c844-d63a-420b-a123-2f549734e048","Type":"ContainerDied","Data":"5ea1d61a72e7b78d6c12f757260e6d4b972c3d714902d9db2980582edf37ba47"} Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.495127 4845 scope.go:117] "RemoveContainer" containerID="132dea06ce0c292cbce3471aeb4e26b414844fc39413f9d43ce9ffce9a6057da" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.521686 4845 scope.go:117] "RemoveContainer" containerID="132dea06ce0c292cbce3471aeb4e26b414844fc39413f9d43ce9ffce9a6057da" Oct 06 07:03:39 crc kubenswrapper[4845]: E1006 07:03:39.522227 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"132dea06ce0c292cbce3471aeb4e26b414844fc39413f9d43ce9ffce9a6057da\": container with ID starting with 132dea06ce0c292cbce3471aeb4e26b414844fc39413f9d43ce9ffce9a6057da not found: ID does not exist" containerID="132dea06ce0c292cbce3471aeb4e26b414844fc39413f9d43ce9ffce9a6057da" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 
07:03:39.522271 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"132dea06ce0c292cbce3471aeb4e26b414844fc39413f9d43ce9ffce9a6057da"} err="failed to get container status \"132dea06ce0c292cbce3471aeb4e26b414844fc39413f9d43ce9ffce9a6057da\": rpc error: code = NotFound desc = could not find container \"132dea06ce0c292cbce3471aeb4e26b414844fc39413f9d43ce9ffce9a6057da\": container with ID starting with 132dea06ce0c292cbce3471aeb4e26b414844fc39413f9d43ce9ffce9a6057da not found: ID does not exist" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.562248 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d99c844-d63a-420b-a123-2f549734e048-config-data\") pod \"4d99c844-d63a-420b-a123-2f549734e048\" (UID: \"4d99c844-d63a-420b-a123-2f549734e048\") " Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.562365 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k75j4\" (UniqueName: \"kubernetes.io/projected/4d99c844-d63a-420b-a123-2f549734e048-kube-api-access-k75j4\") pod \"4d99c844-d63a-420b-a123-2f549734e048\" (UID: \"4d99c844-d63a-420b-a123-2f549734e048\") " Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.562633 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d99c844-d63a-420b-a123-2f549734e048-combined-ca-bundle\") pod \"4d99c844-d63a-420b-a123-2f549734e048\" (UID: \"4d99c844-d63a-420b-a123-2f549734e048\") " Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.567975 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d99c844-d63a-420b-a123-2f549734e048-kube-api-access-k75j4" (OuterVolumeSpecName: "kube-api-access-k75j4") pod "4d99c844-d63a-420b-a123-2f549734e048" (UID: "4d99c844-d63a-420b-a123-2f549734e048"). 
InnerVolumeSpecName "kube-api-access-k75j4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.588217 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d99c844-d63a-420b-a123-2f549734e048-config-data" (OuterVolumeSpecName: "config-data") pod "4d99c844-d63a-420b-a123-2f549734e048" (UID: "4d99c844-d63a-420b-a123-2f549734e048"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.590236 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d99c844-d63a-420b-a123-2f549734e048-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d99c844-d63a-420b-a123-2f549734e048" (UID: "4d99c844-d63a-420b-a123-2f549734e048"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.665283 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d99c844-d63a-420b-a123-2f549734e048-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.665351 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d99c844-d63a-420b-a123-2f549734e048-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.665367 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k75j4\" (UniqueName: \"kubernetes.io/projected/4d99c844-d63a-420b-a123-2f549734e048-kube-api-access-k75j4\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.826976 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 
07:03:39.837994 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.848139 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 07:03:39 crc kubenswrapper[4845]: E1006 07:03:39.849098 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d99c844-d63a-420b-a123-2f549734e048" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.849126 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d99c844-d63a-420b-a123-2f549734e048" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.849359 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d99c844-d63a-420b-a123-2f549734e048" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.850092 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.852476 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.852514 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.852679 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.858007 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.972193 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c326f85b-5b04-4ff0-a0e4-29a1e11eefb2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c326f85b-5b04-4ff0-a0e4-29a1e11eefb2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.972284 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c326f85b-5b04-4ff0-a0e4-29a1e11eefb2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c326f85b-5b04-4ff0-a0e4-29a1e11eefb2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.972460 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp824\" (UniqueName: \"kubernetes.io/projected/c326f85b-5b04-4ff0-a0e4-29a1e11eefb2-kube-api-access-tp824\") pod \"nova-cell1-novncproxy-0\" (UID: \"c326f85b-5b04-4ff0-a0e4-29a1e11eefb2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 
06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.972845 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c326f85b-5b04-4ff0-a0e4-29a1e11eefb2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c326f85b-5b04-4ff0-a0e4-29a1e11eefb2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:39 crc kubenswrapper[4845]: I1006 07:03:39.972873 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c326f85b-5b04-4ff0-a0e4-29a1e11eefb2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c326f85b-5b04-4ff0-a0e4-29a1e11eefb2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:40 crc kubenswrapper[4845]: I1006 07:03:40.074390 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c326f85b-5b04-4ff0-a0e4-29a1e11eefb2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c326f85b-5b04-4ff0-a0e4-29a1e11eefb2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:40 crc kubenswrapper[4845]: I1006 07:03:40.074466 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp824\" (UniqueName: \"kubernetes.io/projected/c326f85b-5b04-4ff0-a0e4-29a1e11eefb2-kube-api-access-tp824\") pod \"nova-cell1-novncproxy-0\" (UID: \"c326f85b-5b04-4ff0-a0e4-29a1e11eefb2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:40 crc kubenswrapper[4845]: I1006 07:03:40.074590 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c326f85b-5b04-4ff0-a0e4-29a1e11eefb2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c326f85b-5b04-4ff0-a0e4-29a1e11eefb2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:40 crc kubenswrapper[4845]: I1006 07:03:40.074616 
4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c326f85b-5b04-4ff0-a0e4-29a1e11eefb2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c326f85b-5b04-4ff0-a0e4-29a1e11eefb2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:40 crc kubenswrapper[4845]: I1006 07:03:40.074691 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c326f85b-5b04-4ff0-a0e4-29a1e11eefb2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c326f85b-5b04-4ff0-a0e4-29a1e11eefb2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:40 crc kubenswrapper[4845]: I1006 07:03:40.081166 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c326f85b-5b04-4ff0-a0e4-29a1e11eefb2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c326f85b-5b04-4ff0-a0e4-29a1e11eefb2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:40 crc kubenswrapper[4845]: I1006 07:03:40.081195 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c326f85b-5b04-4ff0-a0e4-29a1e11eefb2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c326f85b-5b04-4ff0-a0e4-29a1e11eefb2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:40 crc kubenswrapper[4845]: I1006 07:03:40.081211 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c326f85b-5b04-4ff0-a0e4-29a1e11eefb2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c326f85b-5b04-4ff0-a0e4-29a1e11eefb2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:40 crc kubenswrapper[4845]: I1006 07:03:40.081410 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/c326f85b-5b04-4ff0-a0e4-29a1e11eefb2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c326f85b-5b04-4ff0-a0e4-29a1e11eefb2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:40 crc kubenswrapper[4845]: I1006 07:03:40.089225 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp824\" (UniqueName: \"kubernetes.io/projected/c326f85b-5b04-4ff0-a0e4-29a1e11eefb2-kube-api-access-tp824\") pod \"nova-cell1-novncproxy-0\" (UID: \"c326f85b-5b04-4ff0-a0e4-29a1e11eefb2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:40 crc kubenswrapper[4845]: I1006 07:03:40.170356 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:40 crc kubenswrapper[4845]: I1006 07:03:40.241615 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d99c844-d63a-420b-a123-2f549734e048" path="/var/lib/kubelet/pods/4d99c844-d63a-420b-a123-2f549734e048/volumes" Oct 06 07:03:40 crc kubenswrapper[4845]: I1006 07:03:40.592675 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 07:03:40 crc kubenswrapper[4845]: W1006 07:03:40.593802 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc326f85b_5b04_4ff0_a0e4_29a1e11eefb2.slice/crio-fb273cc6f1bb8d0a3a27aa7243301d7d1030809648988813e02fc2f2ff06e224 WatchSource:0}: Error finding container fb273cc6f1bb8d0a3a27aa7243301d7d1030809648988813e02fc2f2ff06e224: Status 404 returned error can't find the container with id fb273cc6f1bb8d0a3a27aa7243301d7d1030809648988813e02fc2f2ff06e224 Oct 06 07:03:41 crc kubenswrapper[4845]: I1006 07:03:41.517268 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"c326f85b-5b04-4ff0-a0e4-29a1e11eefb2","Type":"ContainerStarted","Data":"12399756023d5e8e98a7548c3af192f3d21ea37ab0ec19a2b521d9f4d25b2ec8"} Oct 06 07:03:41 crc kubenswrapper[4845]: I1006 07:03:41.517670 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c326f85b-5b04-4ff0-a0e4-29a1e11eefb2","Type":"ContainerStarted","Data":"fb273cc6f1bb8d0a3a27aa7243301d7d1030809648988813e02fc2f2ff06e224"} Oct 06 07:03:41 crc kubenswrapper[4845]: I1006 07:03:41.557200 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.557171923 podStartE2EDuration="2.557171923s" podCreationTimestamp="2025-10-06 07:03:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:03:41.54223362 +0000 UTC m=+1106.056974688" watchObservedRunningTime="2025-10-06 07:03:41.557171923 +0000 UTC m=+1106.071912971" Oct 06 07:03:41 crc kubenswrapper[4845]: I1006 07:03:41.986550 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 07:03:41 crc kubenswrapper[4845]: I1006 07:03:41.988111 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 07:03:41 crc kubenswrapper[4845]: I1006 07:03:41.988342 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 07:03:41 crc kubenswrapper[4845]: I1006 07:03:41.990898 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.526051 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.529027 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" 
Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.693590 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fff6d6bd5-92lvg"] Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.695688 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.717633 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fff6d6bd5-92lvg"] Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.824970 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-dns-swift-storage-0\") pod \"dnsmasq-dns-6fff6d6bd5-92lvg\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") " pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.825011 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-dns-svc\") pod \"dnsmasq-dns-6fff6d6bd5-92lvg\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") " pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.825033 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-ovsdbserver-nb\") pod \"dnsmasq-dns-6fff6d6bd5-92lvg\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") " pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.825132 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgdrq\" (UniqueName: 
\"kubernetes.io/projected/b1c731f5-0d76-4221-9d87-67d746bbc8e6-kube-api-access-dgdrq\") pod \"dnsmasq-dns-6fff6d6bd5-92lvg\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") " pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.825367 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-ovsdbserver-sb\") pod \"dnsmasq-dns-6fff6d6bd5-92lvg\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") " pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.825440 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-config\") pod \"dnsmasq-dns-6fff6d6bd5-92lvg\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") " pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.927706 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-ovsdbserver-sb\") pod \"dnsmasq-dns-6fff6d6bd5-92lvg\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") " pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.927768 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-config\") pod \"dnsmasq-dns-6fff6d6bd5-92lvg\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") " pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.927856 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-dns-swift-storage-0\") pod \"dnsmasq-dns-6fff6d6bd5-92lvg\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") " pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.927875 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-dns-svc\") pod \"dnsmasq-dns-6fff6d6bd5-92lvg\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") " pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.927894 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-ovsdbserver-nb\") pod \"dnsmasq-dns-6fff6d6bd5-92lvg\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") " pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.927926 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgdrq\" (UniqueName: \"kubernetes.io/projected/b1c731f5-0d76-4221-9d87-67d746bbc8e6-kube-api-access-dgdrq\") pod \"dnsmasq-dns-6fff6d6bd5-92lvg\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") " pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.928966 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-ovsdbserver-sb\") pod \"dnsmasq-dns-6fff6d6bd5-92lvg\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") " pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.929178 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-dns-svc\") pod \"dnsmasq-dns-6fff6d6bd5-92lvg\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") " pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.929179 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-config\") pod \"dnsmasq-dns-6fff6d6bd5-92lvg\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") " pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.929210 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-dns-swift-storage-0\") pod \"dnsmasq-dns-6fff6d6bd5-92lvg\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") " pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.929677 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-ovsdbserver-nb\") pod \"dnsmasq-dns-6fff6d6bd5-92lvg\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") " pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:42 crc kubenswrapper[4845]: I1006 07:03:42.946065 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgdrq\" (UniqueName: \"kubernetes.io/projected/b1c731f5-0d76-4221-9d87-67d746bbc8e6-kube-api-access-dgdrq\") pod \"dnsmasq-dns-6fff6d6bd5-92lvg\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") " pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:43 crc kubenswrapper[4845]: I1006 07:03:43.029466 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:43 crc kubenswrapper[4845]: I1006 07:03:43.483817 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fff6d6bd5-92lvg"] Oct 06 07:03:43 crc kubenswrapper[4845]: W1006 07:03:43.485561 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1c731f5_0d76_4221_9d87_67d746bbc8e6.slice/crio-1ec94e9c5ab2662a86d942d8a59712ee39cc5a28e351bfc2e0012691b16651f6 WatchSource:0}: Error finding container 1ec94e9c5ab2662a86d942d8a59712ee39cc5a28e351bfc2e0012691b16651f6: Status 404 returned error can't find the container with id 1ec94e9c5ab2662a86d942d8a59712ee39cc5a28e351bfc2e0012691b16651f6 Oct 06 07:03:43 crc kubenswrapper[4845]: I1006 07:03:43.534353 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" event={"ID":"b1c731f5-0d76-4221-9d87-67d746bbc8e6","Type":"ContainerStarted","Data":"1ec94e9c5ab2662a86d942d8a59712ee39cc5a28e351bfc2e0012691b16651f6"} Oct 06 07:03:44 crc kubenswrapper[4845]: I1006 07:03:44.469307 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:03:44 crc kubenswrapper[4845]: I1006 07:03:44.470036 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerName="ceilometer-central-agent" containerID="cri-o://d25e9fd0766b138795e127a48526d59a6c74f03dde150270047df9acec9bc025" gracePeriod=30 Oct 06 07:03:44 crc kubenswrapper[4845]: I1006 07:03:44.470107 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerName="sg-core" containerID="cri-o://a406e9f3d92f9a261f69a9f54dcf9612cd05e7804dbc6505c0a4115d7c02c34a" gracePeriod=30 Oct 06 07:03:44 crc kubenswrapper[4845]: I1006 07:03:44.470170 
4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerName="ceilometer-notification-agent" containerID="cri-o://69b52cc998ba310ccb996e370009c0b07f90bb923815e6afa911912728d61d61" gracePeriod=30 Oct 06 07:03:44 crc kubenswrapper[4845]: I1006 07:03:44.470256 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerName="proxy-httpd" containerID="cri-o://6972853ad13d6b588ccb72cd1c546b3f8b3c82d5793ea9c3418920ee8ebe5c7d" gracePeriod=30 Oct 06 07:03:44 crc kubenswrapper[4845]: I1006 07:03:44.480728 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.196:3000/\": EOF" Oct 06 07:03:44 crc kubenswrapper[4845]: I1006 07:03:44.543273 4845 generic.go:334] "Generic (PLEG): container finished" podID="b1c731f5-0d76-4221-9d87-67d746bbc8e6" containerID="82eb10f2cbbd53166bc7f6c9a3884fa981c94ccd8c3bec1411504d173de1f48c" exitCode=0 Oct 06 07:03:44 crc kubenswrapper[4845]: I1006 07:03:44.543389 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" event={"ID":"b1c731f5-0d76-4221-9d87-67d746bbc8e6","Type":"ContainerDied","Data":"82eb10f2cbbd53166bc7f6c9a3884fa981c94ccd8c3bec1411504d173de1f48c"} Oct 06 07:03:45 crc kubenswrapper[4845]: I1006 07:03:45.039121 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 07:03:45 crc kubenswrapper[4845]: I1006 07:03:45.172007 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 06 07:03:45 crc kubenswrapper[4845]: I1006 07:03:45.560824 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" 
event={"ID":"b1c731f5-0d76-4221-9d87-67d746bbc8e6","Type":"ContainerStarted","Data":"b99dfb34d23ee4ad9d1c9c2b078c301abfc5afd758206014ab557f6cbc86d7bd"} Oct 06 07:03:45 crc kubenswrapper[4845]: I1006 07:03:45.561174 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" Oct 06 07:03:45 crc kubenswrapper[4845]: I1006 07:03:45.567778 4845 generic.go:334] "Generic (PLEG): container finished" podID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerID="6972853ad13d6b588ccb72cd1c546b3f8b3c82d5793ea9c3418920ee8ebe5c7d" exitCode=0 Oct 06 07:03:45 crc kubenswrapper[4845]: I1006 07:03:45.567803 4845 generic.go:334] "Generic (PLEG): container finished" podID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerID="a406e9f3d92f9a261f69a9f54dcf9612cd05e7804dbc6505c0a4115d7c02c34a" exitCode=2 Oct 06 07:03:45 crc kubenswrapper[4845]: I1006 07:03:45.567813 4845 generic.go:334] "Generic (PLEG): container finished" podID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerID="d25e9fd0766b138795e127a48526d59a6c74f03dde150270047df9acec9bc025" exitCode=0 Oct 06 07:03:45 crc kubenswrapper[4845]: I1006 07:03:45.567911 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2","Type":"ContainerDied","Data":"6972853ad13d6b588ccb72cd1c546b3f8b3c82d5793ea9c3418920ee8ebe5c7d"} Oct 06 07:03:45 crc kubenswrapper[4845]: I1006 07:03:45.567970 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2","Type":"ContainerDied","Data":"a406e9f3d92f9a261f69a9f54dcf9612cd05e7804dbc6505c0a4115d7c02c34a"} Oct 06 07:03:45 crc kubenswrapper[4845]: I1006 07:03:45.567971 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2b4b64cd-a047-4723-9dfd-af1dc10bbecf" containerName="nova-api-log" 
containerID="cri-o://d986056f321eb8b3db6603aeae21d99a4b92c36a7d8f260ba4f0d4b8676b1aa6" gracePeriod=30 Oct 06 07:03:45 crc kubenswrapper[4845]: I1006 07:03:45.567984 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2","Type":"ContainerDied","Data":"d25e9fd0766b138795e127a48526d59a6c74f03dde150270047df9acec9bc025"} Oct 06 07:03:45 crc kubenswrapper[4845]: I1006 07:03:45.568064 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2b4b64cd-a047-4723-9dfd-af1dc10bbecf" containerName="nova-api-api" containerID="cri-o://d1984a414f467e2914873ea0233ecfb3482b285b7941ceb3af91b389dd5e55fc" gracePeriod=30 Oct 06 07:03:45 crc kubenswrapper[4845]: I1006 07:03:45.582586 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" podStartSLOduration=3.58257022 podStartE2EDuration="3.58257022s" podCreationTimestamp="2025-10-06 07:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:03:45.577819685 +0000 UTC m=+1110.092560703" watchObservedRunningTime="2025-10-06 07:03:45.58257022 +0000 UTC m=+1110.097311228" Oct 06 07:03:46 crc kubenswrapper[4845]: I1006 07:03:46.578065 4845 generic.go:334] "Generic (PLEG): container finished" podID="2b4b64cd-a047-4723-9dfd-af1dc10bbecf" containerID="d986056f321eb8b3db6603aeae21d99a4b92c36a7d8f260ba4f0d4b8676b1aa6" exitCode=143 Oct 06 07:03:46 crc kubenswrapper[4845]: I1006 07:03:46.578156 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b4b64cd-a047-4723-9dfd-af1dc10bbecf","Type":"ContainerDied","Data":"d986056f321eb8b3db6603aeae21d99a4b92c36a7d8f260ba4f0d4b8676b1aa6"} Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.139159 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.231155 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-run-httpd\") pod \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.231607 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-ceilometer-tls-certs\") pod \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.231674 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" (UID: "706929e7-8bd8-42dc-ab60-62c8ddf6a3f2"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.231891 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-config-data\") pod \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.232017 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-combined-ca-bundle\") pod \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.232162 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-log-httpd\") pod \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.232326 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkdrl\" (UniqueName: \"kubernetes.io/projected/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-kube-api-access-kkdrl\") pod \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.232675 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-sg-core-conf-yaml\") pod \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.232794 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" (UID: "706929e7-8bd8-42dc-ab60-62c8ddf6a3f2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.232979 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-scripts\") pod \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\" (UID: \"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2\") " Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.233740 4845 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.233887 4845 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.237008 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-kube-api-access-kkdrl" (OuterVolumeSpecName: "kube-api-access-kkdrl") pod "706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" (UID: "706929e7-8bd8-42dc-ab60-62c8ddf6a3f2"). InnerVolumeSpecName "kube-api-access-kkdrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.240088 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-scripts" (OuterVolumeSpecName: "scripts") pod "706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" (UID: "706929e7-8bd8-42dc-ab60-62c8ddf6a3f2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.280303 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" (UID: "706929e7-8bd8-42dc-ab60-62c8ddf6a3f2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.313483 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" (UID: "706929e7-8bd8-42dc-ab60-62c8ddf6a3f2"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.323179 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" (UID: "706929e7-8bd8-42dc-ab60-62c8ddf6a3f2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.335909 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.335948 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkdrl\" (UniqueName: \"kubernetes.io/projected/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-kube-api-access-kkdrl\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.335961 4845 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.335974 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.335986 4845 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.371398 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-config-data" (OuterVolumeSpecName: "config-data") pod "706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" (UID: "706929e7-8bd8-42dc-ab60-62c8ddf6a3f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.437971 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.601158 4845 generic.go:334] "Generic (PLEG): container finished" podID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerID="69b52cc998ba310ccb996e370009c0b07f90bb923815e6afa911912728d61d61" exitCode=0 Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.601202 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2","Type":"ContainerDied","Data":"69b52cc998ba310ccb996e370009c0b07f90bb923815e6afa911912728d61d61"} Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.601234 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"706929e7-8bd8-42dc-ab60-62c8ddf6a3f2","Type":"ContainerDied","Data":"51c1f9d0284b90dd497cba3a9807962d595ad67fb2168dc0f4253f7de7b7a87b"} Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.601255 4845 scope.go:117] "RemoveContainer" containerID="6972853ad13d6b588ccb72cd1c546b3f8b3c82d5793ea9c3418920ee8ebe5c7d" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.601410 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.627201 4845 scope.go:117] "RemoveContainer" containerID="a406e9f3d92f9a261f69a9f54dcf9612cd05e7804dbc6505c0a4115d7c02c34a" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.657970 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.685902 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.692946 4845 scope.go:117] "RemoveContainer" containerID="69b52cc998ba310ccb996e370009c0b07f90bb923815e6afa911912728d61d61" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.702466 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:03:48 crc kubenswrapper[4845]: E1006 07:03:48.702878 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerName="proxy-httpd" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.702890 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerName="proxy-httpd" Oct 06 07:03:48 crc kubenswrapper[4845]: E1006 07:03:48.702925 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerName="ceilometer-central-agent" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.702930 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerName="ceilometer-central-agent" Oct 06 07:03:48 crc kubenswrapper[4845]: E1006 07:03:48.702938 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerName="ceilometer-notification-agent" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.702944 4845 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerName="ceilometer-notification-agent" Oct 06 07:03:48 crc kubenswrapper[4845]: E1006 07:03:48.702959 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerName="sg-core" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.702966 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerName="sg-core" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.703142 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerName="proxy-httpd" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.703164 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerName="sg-core" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.703176 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerName="ceilometer-notification-agent" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.703188 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" containerName="ceilometer-central-agent" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.704822 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.713273 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.716602 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.716691 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.718558 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.723042 4845 scope.go:117] "RemoveContainer" containerID="d25e9fd0766b138795e127a48526d59a6c74f03dde150270047df9acec9bc025" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.845232 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.845523 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krkd5\" (UniqueName: \"kubernetes.io/projected/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-kube-api-access-krkd5\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.845761 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-config-data\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " 
pod="openstack/ceilometer-0" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.845818 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-log-httpd\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.845886 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-run-httpd\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.845926 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.846045 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.846147 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-scripts\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.881465 4845 scope.go:117] "RemoveContainer" 
containerID="6972853ad13d6b588ccb72cd1c546b3f8b3c82d5793ea9c3418920ee8ebe5c7d" Oct 06 07:03:48 crc kubenswrapper[4845]: E1006 07:03:48.882203 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6972853ad13d6b588ccb72cd1c546b3f8b3c82d5793ea9c3418920ee8ebe5c7d\": container with ID starting with 6972853ad13d6b588ccb72cd1c546b3f8b3c82d5793ea9c3418920ee8ebe5c7d not found: ID does not exist" containerID="6972853ad13d6b588ccb72cd1c546b3f8b3c82d5793ea9c3418920ee8ebe5c7d" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.882238 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6972853ad13d6b588ccb72cd1c546b3f8b3c82d5793ea9c3418920ee8ebe5c7d"} err="failed to get container status \"6972853ad13d6b588ccb72cd1c546b3f8b3c82d5793ea9c3418920ee8ebe5c7d\": rpc error: code = NotFound desc = could not find container \"6972853ad13d6b588ccb72cd1c546b3f8b3c82d5793ea9c3418920ee8ebe5c7d\": container with ID starting with 6972853ad13d6b588ccb72cd1c546b3f8b3c82d5793ea9c3418920ee8ebe5c7d not found: ID does not exist" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.882267 4845 scope.go:117] "RemoveContainer" containerID="a406e9f3d92f9a261f69a9f54dcf9612cd05e7804dbc6505c0a4115d7c02c34a" Oct 06 07:03:48 crc kubenswrapper[4845]: E1006 07:03:48.882906 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a406e9f3d92f9a261f69a9f54dcf9612cd05e7804dbc6505c0a4115d7c02c34a\": container with ID starting with a406e9f3d92f9a261f69a9f54dcf9612cd05e7804dbc6505c0a4115d7c02c34a not found: ID does not exist" containerID="a406e9f3d92f9a261f69a9f54dcf9612cd05e7804dbc6505c0a4115d7c02c34a" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.882951 4845 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a406e9f3d92f9a261f69a9f54dcf9612cd05e7804dbc6505c0a4115d7c02c34a"} err="failed to get container status \"a406e9f3d92f9a261f69a9f54dcf9612cd05e7804dbc6505c0a4115d7c02c34a\": rpc error: code = NotFound desc = could not find container \"a406e9f3d92f9a261f69a9f54dcf9612cd05e7804dbc6505c0a4115d7c02c34a\": container with ID starting with a406e9f3d92f9a261f69a9f54dcf9612cd05e7804dbc6505c0a4115d7c02c34a not found: ID does not exist" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.882972 4845 scope.go:117] "RemoveContainer" containerID="69b52cc998ba310ccb996e370009c0b07f90bb923815e6afa911912728d61d61" Oct 06 07:03:48 crc kubenswrapper[4845]: E1006 07:03:48.883194 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69b52cc998ba310ccb996e370009c0b07f90bb923815e6afa911912728d61d61\": container with ID starting with 69b52cc998ba310ccb996e370009c0b07f90bb923815e6afa911912728d61d61 not found: ID does not exist" containerID="69b52cc998ba310ccb996e370009c0b07f90bb923815e6afa911912728d61d61" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.883220 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69b52cc998ba310ccb996e370009c0b07f90bb923815e6afa911912728d61d61"} err="failed to get container status \"69b52cc998ba310ccb996e370009c0b07f90bb923815e6afa911912728d61d61\": rpc error: code = NotFound desc = could not find container \"69b52cc998ba310ccb996e370009c0b07f90bb923815e6afa911912728d61d61\": container with ID starting with 69b52cc998ba310ccb996e370009c0b07f90bb923815e6afa911912728d61d61 not found: ID does not exist" Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.883239 4845 scope.go:117] "RemoveContainer" containerID="d25e9fd0766b138795e127a48526d59a6c74f03dde150270047df9acec9bc025" Oct 06 07:03:48 crc kubenswrapper[4845]: E1006 07:03:48.883509 4845 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d25e9fd0766b138795e127a48526d59a6c74f03dde150270047df9acec9bc025\": container with ID starting with d25e9fd0766b138795e127a48526d59a6c74f03dde150270047df9acec9bc025 not found: ID does not exist" containerID="d25e9fd0766b138795e127a48526d59a6c74f03dde150270047df9acec9bc025"
Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.883540 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d25e9fd0766b138795e127a48526d59a6c74f03dde150270047df9acec9bc025"} err="failed to get container status \"d25e9fd0766b138795e127a48526d59a6c74f03dde150270047df9acec9bc025\": rpc error: code = NotFound desc = could not find container \"d25e9fd0766b138795e127a48526d59a6c74f03dde150270047df9acec9bc025\": container with ID starting with d25e9fd0766b138795e127a48526d59a6c74f03dde150270047df9acec9bc025 not found: ID does not exist"
Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.947916 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0"
Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.948036 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krkd5\" (UniqueName: \"kubernetes.io/projected/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-kube-api-access-krkd5\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0"
Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.948114 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-config-data\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0"
Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.948137 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-log-httpd\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0"
Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.948183 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-run-httpd\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0"
Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.948209 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0"
Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.948265 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0"
Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.948302 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-scripts\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0"
Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.948726 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-run-httpd\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0"
Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.949172 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-log-httpd\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0"
Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.952756 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-scripts\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0"
Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.953053 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0"
Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.956663 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-config-data\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0"
Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.957855 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0"
Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.967000 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krkd5\" (UniqueName: \"kubernetes.io/projected/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-kube-api-access-krkd5\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0"
Oct 06 07:03:48 crc kubenswrapper[4845]: I1006 07:03:48.975460 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66468bd9-d0ea-4117-a963-3e7fb9b3c54d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"66468bd9-d0ea-4117-a963-3e7fb9b3c54d\") " pod="openstack/ceilometer-0"
Oct 06 07:03:49 crc kubenswrapper[4845]: I1006 07:03:49.109467 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 06 07:03:49 crc kubenswrapper[4845]: I1006 07:03:49.162115 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 07:03:49 crc kubenswrapper[4845]: I1006 07:03:49.254501 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc2qm\" (UniqueName: \"kubernetes.io/projected/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-kube-api-access-rc2qm\") pod \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\" (UID: \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\") "
Oct 06 07:03:49 crc kubenswrapper[4845]: I1006 07:03:49.254554 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-logs\") pod \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\" (UID: \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\") "
Oct 06 07:03:49 crc kubenswrapper[4845]: I1006 07:03:49.254629 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-config-data\") pod \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\" (UID: \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\") "
Oct 06 07:03:49 crc kubenswrapper[4845]: I1006 07:03:49.255145 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-logs" (OuterVolumeSpecName: "logs") pod "2b4b64cd-a047-4723-9dfd-af1dc10bbecf" (UID: "2b4b64cd-a047-4723-9dfd-af1dc10bbecf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:03:49 crc kubenswrapper[4845]: I1006 07:03:49.255364 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-combined-ca-bundle\") pod \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\" (UID: \"2b4b64cd-a047-4723-9dfd-af1dc10bbecf\") "
Oct 06 07:03:49 crc kubenswrapper[4845]: I1006 07:03:49.256094 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-logs\") on node \"crc\" DevicePath \"\""
Oct 06 07:03:49 crc kubenswrapper[4845]: I1006 07:03:49.259188 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-kube-api-access-rc2qm" (OuterVolumeSpecName: "kube-api-access-rc2qm") pod "2b4b64cd-a047-4723-9dfd-af1dc10bbecf" (UID: "2b4b64cd-a047-4723-9dfd-af1dc10bbecf"). InnerVolumeSpecName "kube-api-access-rc2qm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:03:49 crc kubenswrapper[4845]: I1006 07:03:49.287595 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b4b64cd-a047-4723-9dfd-af1dc10bbecf" (UID: "2b4b64cd-a047-4723-9dfd-af1dc10bbecf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:03:49 crc kubenswrapper[4845]: I1006 07:03:49.293521 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-config-data" (OuterVolumeSpecName: "config-data") pod "2b4b64cd-a047-4723-9dfd-af1dc10bbecf" (UID: "2b4b64cd-a047-4723-9dfd-af1dc10bbecf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:03:49 crc kubenswrapper[4845]: I1006 07:03:49.357333 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 07:03:49 crc kubenswrapper[4845]: I1006 07:03:49.357389 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 07:03:49 crc kubenswrapper[4845]: I1006 07:03:49.357402 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc2qm\" (UniqueName: \"kubernetes.io/projected/2b4b64cd-a047-4723-9dfd-af1dc10bbecf-kube-api-access-rc2qm\") on node \"crc\" DevicePath \"\""
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.610349 4845 generic.go:334] "Generic (PLEG): container finished" podID="2b4b64cd-a047-4723-9dfd-af1dc10bbecf" containerID="d1984a414f467e2914873ea0233ecfb3482b285b7941ceb3af91b389dd5e55fc" exitCode=0
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.610410 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b4b64cd-a047-4723-9dfd-af1dc10bbecf","Type":"ContainerDied","Data":"d1984a414f467e2914873ea0233ecfb3482b285b7941ceb3af91b389dd5e55fc"}
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.610805 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2b4b64cd-a047-4723-9dfd-af1dc10bbecf","Type":"ContainerDied","Data":"825e4781de952a275be3b3d688dcdd38ed7d8687eff7c349e09c7dde7a65de39"}
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.610829 4845 scope.go:117] "RemoveContainer" containerID="d1984a414f467e2914873ea0233ecfb3482b285b7941ceb3af91b389dd5e55fc"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.610476 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.679574 4845 scope.go:117] "RemoveContainer" containerID="d986056f321eb8b3db6603aeae21d99a4b92c36a7d8f260ba4f0d4b8676b1aa6"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.688830 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.737719 4845 scope.go:117] "RemoveContainer" containerID="d1984a414f467e2914873ea0233ecfb3482b285b7941ceb3af91b389dd5e55fc"
Oct 06 07:03:50 crc kubenswrapper[4845]: E1006 07:03:49.741457 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1984a414f467e2914873ea0233ecfb3482b285b7941ceb3af91b389dd5e55fc\": container with ID starting with d1984a414f467e2914873ea0233ecfb3482b285b7941ceb3af91b389dd5e55fc not found: ID does not exist" containerID="d1984a414f467e2914873ea0233ecfb3482b285b7941ceb3af91b389dd5e55fc"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.741494 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1984a414f467e2914873ea0233ecfb3482b285b7941ceb3af91b389dd5e55fc"} err="failed to get container status \"d1984a414f467e2914873ea0233ecfb3482b285b7941ceb3af91b389dd5e55fc\": rpc error: code = NotFound desc = could not find container \"d1984a414f467e2914873ea0233ecfb3482b285b7941ceb3af91b389dd5e55fc\": container with ID starting with d1984a414f467e2914873ea0233ecfb3482b285b7941ceb3af91b389dd5e55fc not found: ID does not exist"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.741516 4845 scope.go:117] "RemoveContainer" containerID="d986056f321eb8b3db6603aeae21d99a4b92c36a7d8f260ba4f0d4b8676b1aa6"
Oct 06 07:03:50 crc kubenswrapper[4845]: E1006 07:03:49.745561 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d986056f321eb8b3db6603aeae21d99a4b92c36a7d8f260ba4f0d4b8676b1aa6\": container with ID starting with d986056f321eb8b3db6603aeae21d99a4b92c36a7d8f260ba4f0d4b8676b1aa6 not found: ID does not exist" containerID="d986056f321eb8b3db6603aeae21d99a4b92c36a7d8f260ba4f0d4b8676b1aa6"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.745607 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d986056f321eb8b3db6603aeae21d99a4b92c36a7d8f260ba4f0d4b8676b1aa6"} err="failed to get container status \"d986056f321eb8b3db6603aeae21d99a4b92c36a7d8f260ba4f0d4b8676b1aa6\": rpc error: code = NotFound desc = could not find container \"d986056f321eb8b3db6603aeae21d99a4b92c36a7d8f260ba4f0d4b8676b1aa6\": container with ID starting with d986056f321eb8b3db6603aeae21d99a4b92c36a7d8f260ba4f0d4b8676b1aa6 not found: ID does not exist"
Oct 06 07:03:50 crc kubenswrapper[4845]: W1006 07:03:49.757510 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66468bd9_d0ea_4117_a963_3e7fb9b3c54d.slice/crio-0b33eb73916946cf65f3559e98101f12f7f5982ef072d54ecf0aa80fbd62f41c WatchSource:0}: Error finding container 0b33eb73916946cf65f3559e98101f12f7f5982ef072d54ecf0aa80fbd62f41c: Status 404 returned error can't find the container with id 0b33eb73916946cf65f3559e98101f12f7f5982ef072d54ecf0aa80fbd62f41c
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.761432 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.791937 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 06 07:03:50 crc kubenswrapper[4845]: E1006 07:03:49.792690 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b4b64cd-a047-4723-9dfd-af1dc10bbecf" containerName="nova-api-log"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.792708 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b4b64cd-a047-4723-9dfd-af1dc10bbecf" containerName="nova-api-log"
Oct 06 07:03:50 crc kubenswrapper[4845]: E1006 07:03:49.792768 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b4b64cd-a047-4723-9dfd-af1dc10bbecf" containerName="nova-api-api"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.792778 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b4b64cd-a047-4723-9dfd-af1dc10bbecf" containerName="nova-api-api"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.793086 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b4b64cd-a047-4723-9dfd-af1dc10bbecf" containerName="nova-api-api"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.793109 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b4b64cd-a047-4723-9dfd-af1dc10bbecf" containerName="nova-api-log"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.795283 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.807485 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.808474 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.808768 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.852716 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.860764 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.877817 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-public-tls-certs\") pod \"nova-api-0\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.877868 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-config-data\") pod \"nova-api-0\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.877886 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fm62\" (UniqueName: \"kubernetes.io/projected/53664522-1edb-4c52-9931-3cf6c7239112-kube-api-access-5fm62\") pod \"nova-api-0\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.878163 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53664522-1edb-4c52-9931-3cf6c7239112-logs\") pod \"nova-api-0\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.878256 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.878317 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-internal-tls-certs\") pod \"nova-api-0\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.980283 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-public-tls-certs\") pod \"nova-api-0\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.980337 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-config-data\") pod \"nova-api-0\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.980356 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fm62\" (UniqueName: \"kubernetes.io/projected/53664522-1edb-4c52-9931-3cf6c7239112-kube-api-access-5fm62\") pod \"nova-api-0\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.980497 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53664522-1edb-4c52-9931-3cf6c7239112-logs\") pod \"nova-api-0\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.980539 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.980568 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-internal-tls-certs\") pod \"nova-api-0\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.981536 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53664522-1edb-4c52-9931-3cf6c7239112-logs\") pod \"nova-api-0\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.984347 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-public-tls-certs\") pod \"nova-api-0\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.985651 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.986044 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-internal-tls-certs\") pod \"nova-api-0\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.988133 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-config-data\") pod \"nova-api-0\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:49.998447 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fm62\" (UniqueName: \"kubernetes.io/projected/53664522-1edb-4c52-9931-3cf6c7239112-kube-api-access-5fm62\") pod \"nova-api-0\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:50.154432 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:50.172533 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:50.211994 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:50.253735 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b4b64cd-a047-4723-9dfd-af1dc10bbecf" path="/var/lib/kubelet/pods/2b4b64cd-a047-4723-9dfd-af1dc10bbecf/volumes"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:50.254709 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="706929e7-8bd8-42dc-ab60-62c8ddf6a3f2" path="/var/lib/kubelet/pods/706929e7-8bd8-42dc-ab60-62c8ddf6a3f2/volumes"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:50.628649 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 06 07:03:50 crc kubenswrapper[4845]: W1006 07:03:50.628970 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53664522_1edb_4c52_9931_3cf6c7239112.slice/crio-a7728c6183a1f35fcc213b519323e392633879d9228b4a4538099777edec2e53 WatchSource:0}: Error finding container a7728c6183a1f35fcc213b519323e392633879d9228b4a4538099777edec2e53: Status 404 returned error can't find the container with id a7728c6183a1f35fcc213b519323e392633879d9228b4a4538099777edec2e53
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:50.633635 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66468bd9-d0ea-4117-a963-3e7fb9b3c54d","Type":"ContainerStarted","Data":"c66e6f0b5c698058dca4395fb816345bbfe59016668c9f1e6f420d43f70d805d"}
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:50.633675 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66468bd9-d0ea-4117-a963-3e7fb9b3c54d","Type":"ContainerStarted","Data":"0b33eb73916946cf65f3559e98101f12f7f5982ef072d54ecf0aa80fbd62f41c"}
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:50.649291 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:50.859949 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-bnlg8"]
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:50.861164 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bnlg8"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:50.863107 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:50.863315 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:50.875427 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-bnlg8"]
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:50.954813 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71e8ec4e-fe1b-46eb-9a91-e13178876378-config-data\") pod \"nova-cell1-cell-mapping-bnlg8\" (UID: \"71e8ec4e-fe1b-46eb-9a91-e13178876378\") " pod="openstack/nova-cell1-cell-mapping-bnlg8"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:50.954882 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71e8ec4e-fe1b-46eb-9a91-e13178876378-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bnlg8\" (UID: \"71e8ec4e-fe1b-46eb-9a91-e13178876378\") " pod="openstack/nova-cell1-cell-mapping-bnlg8"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:50.954980 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62ncs\" (UniqueName: \"kubernetes.io/projected/71e8ec4e-fe1b-46eb-9a91-e13178876378-kube-api-access-62ncs\") pod \"nova-cell1-cell-mapping-bnlg8\" (UID: \"71e8ec4e-fe1b-46eb-9a91-e13178876378\") " pod="openstack/nova-cell1-cell-mapping-bnlg8"
Oct 06 07:03:50 crc kubenswrapper[4845]: I1006 07:03:50.955052 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71e8ec4e-fe1b-46eb-9a91-e13178876378-scripts\") pod \"nova-cell1-cell-mapping-bnlg8\" (UID: \"71e8ec4e-fe1b-46eb-9a91-e13178876378\") " pod="openstack/nova-cell1-cell-mapping-bnlg8"
Oct 06 07:03:51 crc kubenswrapper[4845]: I1006 07:03:51.058328 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71e8ec4e-fe1b-46eb-9a91-e13178876378-config-data\") pod \"nova-cell1-cell-mapping-bnlg8\" (UID: \"71e8ec4e-fe1b-46eb-9a91-e13178876378\") " pod="openstack/nova-cell1-cell-mapping-bnlg8"
Oct 06 07:03:51 crc kubenswrapper[4845]: I1006 07:03:51.058461 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71e8ec4e-fe1b-46eb-9a91-e13178876378-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bnlg8\" (UID: \"71e8ec4e-fe1b-46eb-9a91-e13178876378\") " pod="openstack/nova-cell1-cell-mapping-bnlg8"
Oct 06 07:03:51 crc kubenswrapper[4845]: I1006 07:03:51.058506 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62ncs\" (UniqueName: \"kubernetes.io/projected/71e8ec4e-fe1b-46eb-9a91-e13178876378-kube-api-access-62ncs\") pod \"nova-cell1-cell-mapping-bnlg8\" (UID: \"71e8ec4e-fe1b-46eb-9a91-e13178876378\") " pod="openstack/nova-cell1-cell-mapping-bnlg8"
Oct 06 07:03:51 crc kubenswrapper[4845]: I1006 07:03:51.058573 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71e8ec4e-fe1b-46eb-9a91-e13178876378-scripts\") pod \"nova-cell1-cell-mapping-bnlg8\" (UID: \"71e8ec4e-fe1b-46eb-9a91-e13178876378\") " pod="openstack/nova-cell1-cell-mapping-bnlg8"
Oct 06 07:03:51 crc kubenswrapper[4845]: I1006 07:03:51.066259 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71e8ec4e-fe1b-46eb-9a91-e13178876378-scripts\") pod \"nova-cell1-cell-mapping-bnlg8\" (UID: \"71e8ec4e-fe1b-46eb-9a91-e13178876378\") " pod="openstack/nova-cell1-cell-mapping-bnlg8"
Oct 06 07:03:51 crc kubenswrapper[4845]: I1006 07:03:51.082759 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62ncs\" (UniqueName: \"kubernetes.io/projected/71e8ec4e-fe1b-46eb-9a91-e13178876378-kube-api-access-62ncs\") pod \"nova-cell1-cell-mapping-bnlg8\" (UID: \"71e8ec4e-fe1b-46eb-9a91-e13178876378\") " pod="openstack/nova-cell1-cell-mapping-bnlg8"
Oct 06 07:03:51 crc kubenswrapper[4845]: I1006 07:03:51.086106 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71e8ec4e-fe1b-46eb-9a91-e13178876378-config-data\") pod \"nova-cell1-cell-mapping-bnlg8\" (UID: \"71e8ec4e-fe1b-46eb-9a91-e13178876378\") " pod="openstack/nova-cell1-cell-mapping-bnlg8"
Oct 06 07:03:51 crc kubenswrapper[4845]: I1006 07:03:51.095060 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71e8ec4e-fe1b-46eb-9a91-e13178876378-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bnlg8\" (UID: \"71e8ec4e-fe1b-46eb-9a91-e13178876378\") " pod="openstack/nova-cell1-cell-mapping-bnlg8"
Oct 06 07:03:51 crc kubenswrapper[4845]: I1006 07:03:51.184344 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bnlg8"
Oct 06 07:03:51 crc kubenswrapper[4845]: I1006 07:03:51.629469 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-bnlg8"]
Oct 06 07:03:51 crc kubenswrapper[4845]: W1006 07:03:51.638748 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71e8ec4e_fe1b_46eb_9a91_e13178876378.slice/crio-962c199588280a0090b8d0adc038053f632e2dd45cb06c866f5874eb7dd0d354 WatchSource:0}: Error finding container 962c199588280a0090b8d0adc038053f632e2dd45cb06c866f5874eb7dd0d354: Status 404 returned error can't find the container with id 962c199588280a0090b8d0adc038053f632e2dd45cb06c866f5874eb7dd0d354
Oct 06 07:03:51 crc kubenswrapper[4845]: I1006 07:03:51.643107 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"53664522-1edb-4c52-9931-3cf6c7239112","Type":"ContainerStarted","Data":"c4872837a29e35fbc83f30950209f2aa3a78cee4dacf56f243147ad3bcd778ce"}
Oct 06 07:03:51 crc kubenswrapper[4845]: I1006 07:03:51.643148 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"53664522-1edb-4c52-9931-3cf6c7239112","Type":"ContainerStarted","Data":"181f0dee4c9b12db1844a3f68bb3168ba9b3e0d2a0814aae2c447ff80a7d7cb5"}
Oct 06 07:03:51 crc kubenswrapper[4845]: I1006 07:03:51.643448 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"53664522-1edb-4c52-9931-3cf6c7239112","Type":"ContainerStarted","Data":"a7728c6183a1f35fcc213b519323e392633879d9228b4a4538099777edec2e53"}
Oct 06 07:03:51 crc kubenswrapper[4845]: I1006 07:03:51.645298 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66468bd9-d0ea-4117-a963-3e7fb9b3c54d","Type":"ContainerStarted","Data":"cc1dfe257163f08f1a98237adcea5b304086e966f3056d0506959545b556f160"}
Oct 06 07:03:51 crc kubenswrapper[4845]: I1006 07:03:51.645335 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66468bd9-d0ea-4117-a963-3e7fb9b3c54d","Type":"ContainerStarted","Data":"e3f54323d0dcfc9cd33be03760a85ab6aa1a2499031fbfaa4b77426274220132"}
Oct 06 07:03:51 crc kubenswrapper[4845]: I1006 07:03:51.678470 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.6784506 podStartE2EDuration="2.6784506s" podCreationTimestamp="2025-10-06 07:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:03:51.667969203 +0000 UTC m=+1116.182710231" watchObservedRunningTime="2025-10-06 07:03:51.6784506 +0000 UTC m=+1116.193191608"
Oct 06 07:03:52 crc kubenswrapper[4845]: I1006 07:03:52.656648 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bnlg8" event={"ID":"71e8ec4e-fe1b-46eb-9a91-e13178876378","Type":"ContainerStarted","Data":"9ec90fe4ab97cc5b720940614641aacb14ec677cc7789f10ec8c630706b0455a"}
Oct 06 07:03:52 crc kubenswrapper[4845]: I1006 07:03:52.657108 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bnlg8" event={"ID":"71e8ec4e-fe1b-46eb-9a91-e13178876378","Type":"ContainerStarted","Data":"962c199588280a0090b8d0adc038053f632e2dd45cb06c866f5874eb7dd0d354"}
Oct 06 07:03:52 crc kubenswrapper[4845]: I1006 07:03:52.677874 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-bnlg8" podStartSLOduration=2.677858645 podStartE2EDuration="2.677858645s" podCreationTimestamp="2025-10-06 07:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:03:52.675364969 +0000 UTC m=+1117.190105977" watchObservedRunningTime="2025-10-06 07:03:52.677858645 +0000 UTC m=+1117.192599653"
Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.018812 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.019330 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.031152 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg"
Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.086328 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66c996698c-llswm"]
Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.086616 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66c996698c-llswm" podUID="85319c4e-6277-4946-8fbd-aba39a453df8" containerName="dnsmasq-dns" containerID="cri-o://a8cdf2a930bae9e44dcbb9aae18bee7ac8badd880762b942fe71dd628533493d" gracePeriod=10
Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.592360 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66c996698c-llswm"
Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.667925 4845 generic.go:334] "Generic (PLEG): container finished" podID="85319c4e-6277-4946-8fbd-aba39a453df8" containerID="a8cdf2a930bae9e44dcbb9aae18bee7ac8badd880762b942fe71dd628533493d" exitCode=0
Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.668022 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66c996698c-llswm"
Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.668024 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c996698c-llswm" event={"ID":"85319c4e-6277-4946-8fbd-aba39a453df8","Type":"ContainerDied","Data":"a8cdf2a930bae9e44dcbb9aae18bee7ac8badd880762b942fe71dd628533493d"}
Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.668140 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c996698c-llswm" event={"ID":"85319c4e-6277-4946-8fbd-aba39a453df8","Type":"ContainerDied","Data":"867ed98228715d8a54162aba80790d82e8cccb2f75236953955fd9e080687116"}
Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.668165 4845 scope.go:117] "RemoveContainer" containerID="a8cdf2a930bae9e44dcbb9aae18bee7ac8badd880762b942fe71dd628533493d"
Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.693834 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66468bd9-d0ea-4117-a963-3e7fb9b3c54d","Type":"ContainerStarted","Data":"b97455c6738588f8ae3fd0cd912df8b0a97efd1699bb9594ec5280bdb553ac0d"}
Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.693898 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.728085 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.820277328
podStartE2EDuration="5.728060291s" podCreationTimestamp="2025-10-06 07:03:48 +0000 UTC" firstStartedPulling="2025-10-06 07:03:49.770599656 +0000 UTC m=+1114.285340664" lastFinishedPulling="2025-10-06 07:03:52.678382619 +0000 UTC m=+1117.193123627" observedRunningTime="2025-10-06 07:03:53.723148292 +0000 UTC m=+1118.237889300" watchObservedRunningTime="2025-10-06 07:03:53.728060291 +0000 UTC m=+1118.242801299" Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.729330 4845 scope.go:117] "RemoveContainer" containerID="ed411d100b9b5b38d098c1a9ebdcf68acc0a25e4f7a537b7c34fe8b13a5ce08c" Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.732809 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-config\") pod \"85319c4e-6277-4946-8fbd-aba39a453df8\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.732933 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-dns-swift-storage-0\") pod \"85319c4e-6277-4946-8fbd-aba39a453df8\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.733013 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-dns-svc\") pod \"85319c4e-6277-4946-8fbd-aba39a453df8\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.733114 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-ovsdbserver-sb\") pod \"85319c4e-6277-4946-8fbd-aba39a453df8\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " Oct 06 
07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.733165 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lhtf\" (UniqueName: \"kubernetes.io/projected/85319c4e-6277-4946-8fbd-aba39a453df8-kube-api-access-4lhtf\") pod \"85319c4e-6277-4946-8fbd-aba39a453df8\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.733202 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-ovsdbserver-nb\") pod \"85319c4e-6277-4946-8fbd-aba39a453df8\" (UID: \"85319c4e-6277-4946-8fbd-aba39a453df8\") " Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.744651 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85319c4e-6277-4946-8fbd-aba39a453df8-kube-api-access-4lhtf" (OuterVolumeSpecName: "kube-api-access-4lhtf") pod "85319c4e-6277-4946-8fbd-aba39a453df8" (UID: "85319c4e-6277-4946-8fbd-aba39a453df8"). InnerVolumeSpecName "kube-api-access-4lhtf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.761559 4845 scope.go:117] "RemoveContainer" containerID="a8cdf2a930bae9e44dcbb9aae18bee7ac8badd880762b942fe71dd628533493d" Oct 06 07:03:53 crc kubenswrapper[4845]: E1006 07:03:53.762157 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8cdf2a930bae9e44dcbb9aae18bee7ac8badd880762b942fe71dd628533493d\": container with ID starting with a8cdf2a930bae9e44dcbb9aae18bee7ac8badd880762b942fe71dd628533493d not found: ID does not exist" containerID="a8cdf2a930bae9e44dcbb9aae18bee7ac8badd880762b942fe71dd628533493d" Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.762200 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8cdf2a930bae9e44dcbb9aae18bee7ac8badd880762b942fe71dd628533493d"} err="failed to get container status \"a8cdf2a930bae9e44dcbb9aae18bee7ac8badd880762b942fe71dd628533493d\": rpc error: code = NotFound desc = could not find container \"a8cdf2a930bae9e44dcbb9aae18bee7ac8badd880762b942fe71dd628533493d\": container with ID starting with a8cdf2a930bae9e44dcbb9aae18bee7ac8badd880762b942fe71dd628533493d not found: ID does not exist" Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.762257 4845 scope.go:117] "RemoveContainer" containerID="ed411d100b9b5b38d098c1a9ebdcf68acc0a25e4f7a537b7c34fe8b13a5ce08c" Oct 06 07:03:53 crc kubenswrapper[4845]: E1006 07:03:53.763168 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed411d100b9b5b38d098c1a9ebdcf68acc0a25e4f7a537b7c34fe8b13a5ce08c\": container with ID starting with ed411d100b9b5b38d098c1a9ebdcf68acc0a25e4f7a537b7c34fe8b13a5ce08c not found: ID does not exist" containerID="ed411d100b9b5b38d098c1a9ebdcf68acc0a25e4f7a537b7c34fe8b13a5ce08c" Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.763248 
4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed411d100b9b5b38d098c1a9ebdcf68acc0a25e4f7a537b7c34fe8b13a5ce08c"} err="failed to get container status \"ed411d100b9b5b38d098c1a9ebdcf68acc0a25e4f7a537b7c34fe8b13a5ce08c\": rpc error: code = NotFound desc = could not find container \"ed411d100b9b5b38d098c1a9ebdcf68acc0a25e4f7a537b7c34fe8b13a5ce08c\": container with ID starting with ed411d100b9b5b38d098c1a9ebdcf68acc0a25e4f7a537b7c34fe8b13a5ce08c not found: ID does not exist" Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.808209 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "85319c4e-6277-4946-8fbd-aba39a453df8" (UID: "85319c4e-6277-4946-8fbd-aba39a453df8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.809911 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-config" (OuterVolumeSpecName: "config") pod "85319c4e-6277-4946-8fbd-aba39a453df8" (UID: "85319c4e-6277-4946-8fbd-aba39a453df8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.821336 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "85319c4e-6277-4946-8fbd-aba39a453df8" (UID: "85319c4e-6277-4946-8fbd-aba39a453df8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.833687 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "85319c4e-6277-4946-8fbd-aba39a453df8" (UID: "85319c4e-6277-4946-8fbd-aba39a453df8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.835531 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-config\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.835559 4845 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.835572 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.835585 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lhtf\" (UniqueName: \"kubernetes.io/projected/85319c4e-6277-4946-8fbd-aba39a453df8-kube-api-access-4lhtf\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.835595 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.840793 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "85319c4e-6277-4946-8fbd-aba39a453df8" (UID: "85319c4e-6277-4946-8fbd-aba39a453df8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:03:53 crc kubenswrapper[4845]: I1006 07:03:53.937441 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85319c4e-6277-4946-8fbd-aba39a453df8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:54 crc kubenswrapper[4845]: I1006 07:03:54.000217 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66c996698c-llswm"] Oct 06 07:03:54 crc kubenswrapper[4845]: I1006 07:03:54.008675 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66c996698c-llswm"] Oct 06 07:03:54 crc kubenswrapper[4845]: I1006 07:03:54.239080 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85319c4e-6277-4946-8fbd-aba39a453df8" path="/var/lib/kubelet/pods/85319c4e-6277-4946-8fbd-aba39a453df8/volumes" Oct 06 07:03:56 crc kubenswrapper[4845]: I1006 07:03:56.732013 4845 generic.go:334] "Generic (PLEG): container finished" podID="71e8ec4e-fe1b-46eb-9a91-e13178876378" containerID="9ec90fe4ab97cc5b720940614641aacb14ec677cc7789f10ec8c630706b0455a" exitCode=0 Oct 06 07:03:56 crc kubenswrapper[4845]: I1006 07:03:56.732131 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bnlg8" event={"ID":"71e8ec4e-fe1b-46eb-9a91-e13178876378","Type":"ContainerDied","Data":"9ec90fe4ab97cc5b720940614641aacb14ec677cc7789f10ec8c630706b0455a"} Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.073610 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bnlg8" Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.237944 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62ncs\" (UniqueName: \"kubernetes.io/projected/71e8ec4e-fe1b-46eb-9a91-e13178876378-kube-api-access-62ncs\") pod \"71e8ec4e-fe1b-46eb-9a91-e13178876378\" (UID: \"71e8ec4e-fe1b-46eb-9a91-e13178876378\") " Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.238189 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71e8ec4e-fe1b-46eb-9a91-e13178876378-scripts\") pod \"71e8ec4e-fe1b-46eb-9a91-e13178876378\" (UID: \"71e8ec4e-fe1b-46eb-9a91-e13178876378\") " Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.238235 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71e8ec4e-fe1b-46eb-9a91-e13178876378-config-data\") pod \"71e8ec4e-fe1b-46eb-9a91-e13178876378\" (UID: \"71e8ec4e-fe1b-46eb-9a91-e13178876378\") " Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.238289 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71e8ec4e-fe1b-46eb-9a91-e13178876378-combined-ca-bundle\") pod \"71e8ec4e-fe1b-46eb-9a91-e13178876378\" (UID: \"71e8ec4e-fe1b-46eb-9a91-e13178876378\") " Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.245509 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71e8ec4e-fe1b-46eb-9a91-e13178876378-kube-api-access-62ncs" (OuterVolumeSpecName: "kube-api-access-62ncs") pod "71e8ec4e-fe1b-46eb-9a91-e13178876378" (UID: "71e8ec4e-fe1b-46eb-9a91-e13178876378"). InnerVolumeSpecName "kube-api-access-62ncs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.245821 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71e8ec4e-fe1b-46eb-9a91-e13178876378-scripts" (OuterVolumeSpecName: "scripts") pod "71e8ec4e-fe1b-46eb-9a91-e13178876378" (UID: "71e8ec4e-fe1b-46eb-9a91-e13178876378"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.272750 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71e8ec4e-fe1b-46eb-9a91-e13178876378-config-data" (OuterVolumeSpecName: "config-data") pod "71e8ec4e-fe1b-46eb-9a91-e13178876378" (UID: "71e8ec4e-fe1b-46eb-9a91-e13178876378"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.276771 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71e8ec4e-fe1b-46eb-9a91-e13178876378-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71e8ec4e-fe1b-46eb-9a91-e13178876378" (UID: "71e8ec4e-fe1b-46eb-9a91-e13178876378"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.341706 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71e8ec4e-fe1b-46eb-9a91-e13178876378-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.341778 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71e8ec4e-fe1b-46eb-9a91-e13178876378-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.341795 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71e8ec4e-fe1b-46eb-9a91-e13178876378-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.341806 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62ncs\" (UniqueName: \"kubernetes.io/projected/71e8ec4e-fe1b-46eb-9a91-e13178876378-kube-api-access-62ncs\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.758953 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bnlg8" event={"ID":"71e8ec4e-fe1b-46eb-9a91-e13178876378","Type":"ContainerDied","Data":"962c199588280a0090b8d0adc038053f632e2dd45cb06c866f5874eb7dd0d354"} Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.758999 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="962c199588280a0090b8d0adc038053f632e2dd45cb06c866f5874eb7dd0d354" Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.759087 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bnlg8" Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.963162 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.963451 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="53664522-1edb-4c52-9931-3cf6c7239112" containerName="nova-api-log" containerID="cri-o://181f0dee4c9b12db1844a3f68bb3168ba9b3e0d2a0814aae2c447ff80a7d7cb5" gracePeriod=30 Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.963520 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="53664522-1edb-4c52-9931-3cf6c7239112" containerName="nova-api-api" containerID="cri-o://c4872837a29e35fbc83f30950209f2aa3a78cee4dacf56f243147ad3bcd778ce" gracePeriod=30 Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.978982 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 07:03:58 crc kubenswrapper[4845]: I1006 07:03:58.979203 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f3445051-6b05-4a57-8766-4f8f066510e2" containerName="nova-scheduler-scheduler" containerID="cri-o://581b89cd7b16b4e79c10ed8b5878e1d779a532ed6d0907ba99230c5504cf2dc5" gracePeriod=30 Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.016145 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.017021 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="14928035-28fc-46c2-ade7-cc9f24cd0660" containerName="nova-metadata-log" containerID="cri-o://4c26cfc3687dbbd3ab44bb03c190efdd3b7dead4682e0a5673e98c0868e3d64f" gracePeriod=30 Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.017126 4845 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="14928035-28fc-46c2-ade7-cc9f24cd0660" containerName="nova-metadata-metadata" containerID="cri-o://8dbb13cbaf28c2d7893574825aedb1a3d0f6564ea6a9e08ae8287203c21a11e4" gracePeriod=30 Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.546543 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.586078 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-public-tls-certs\") pod \"53664522-1edb-4c52-9931-3cf6c7239112\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.586137 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53664522-1edb-4c52-9931-3cf6c7239112-logs\") pod \"53664522-1edb-4c52-9931-3cf6c7239112\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.586167 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-combined-ca-bundle\") pod \"53664522-1edb-4c52-9931-3cf6c7239112\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.586284 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-config-data\") pod \"53664522-1edb-4c52-9931-3cf6c7239112\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.586308 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-internal-tls-certs\") pod \"53664522-1edb-4c52-9931-3cf6c7239112\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.586468 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fm62\" (UniqueName: \"kubernetes.io/projected/53664522-1edb-4c52-9931-3cf6c7239112-kube-api-access-5fm62\") pod \"53664522-1edb-4c52-9931-3cf6c7239112\" (UID: \"53664522-1edb-4c52-9931-3cf6c7239112\") " Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.586839 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53664522-1edb-4c52-9931-3cf6c7239112-logs" (OuterVolumeSpecName: "logs") pod "53664522-1edb-4c52-9931-3cf6c7239112" (UID: "53664522-1edb-4c52-9931-3cf6c7239112"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.594279 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53664522-1edb-4c52-9931-3cf6c7239112-kube-api-access-5fm62" (OuterVolumeSpecName: "kube-api-access-5fm62") pod "53664522-1edb-4c52-9931-3cf6c7239112" (UID: "53664522-1edb-4c52-9931-3cf6c7239112"). InnerVolumeSpecName "kube-api-access-5fm62". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.624076 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53664522-1edb-4c52-9931-3cf6c7239112" (UID: "53664522-1edb-4c52-9931-3cf6c7239112"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.646267 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-config-data" (OuterVolumeSpecName: "config-data") pod "53664522-1edb-4c52-9931-3cf6c7239112" (UID: "53664522-1edb-4c52-9931-3cf6c7239112"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.646741 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "53664522-1edb-4c52-9931-3cf6c7239112" (UID: "53664522-1edb-4c52-9931-3cf6c7239112"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.659596 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "53664522-1edb-4c52-9931-3cf6c7239112" (UID: "53664522-1edb-4c52-9931-3cf6c7239112"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.689868 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.689896 4845 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.689908 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fm62\" (UniqueName: \"kubernetes.io/projected/53664522-1edb-4c52-9931-3cf6c7239112-kube-api-access-5fm62\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.689916 4845 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.689924 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53664522-1edb-4c52-9931-3cf6c7239112-logs\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.689932 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53664522-1edb-4c52-9931-3cf6c7239112-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.768188 4845 generic.go:334] "Generic (PLEG): container finished" podID="53664522-1edb-4c52-9931-3cf6c7239112" containerID="c4872837a29e35fbc83f30950209f2aa3a78cee4dacf56f243147ad3bcd778ce" exitCode=0 Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.768221 4845 generic.go:334] 
"Generic (PLEG): container finished" podID="53664522-1edb-4c52-9931-3cf6c7239112" containerID="181f0dee4c9b12db1844a3f68bb3168ba9b3e0d2a0814aae2c447ff80a7d7cb5" exitCode=143 Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.768260 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"53664522-1edb-4c52-9931-3cf6c7239112","Type":"ContainerDied","Data":"c4872837a29e35fbc83f30950209f2aa3a78cee4dacf56f243147ad3bcd778ce"} Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.768286 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"53664522-1edb-4c52-9931-3cf6c7239112","Type":"ContainerDied","Data":"181f0dee4c9b12db1844a3f68bb3168ba9b3e0d2a0814aae2c447ff80a7d7cb5"} Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.768296 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"53664522-1edb-4c52-9931-3cf6c7239112","Type":"ContainerDied","Data":"a7728c6183a1f35fcc213b519323e392633879d9228b4a4538099777edec2e53"} Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.768310 4845 scope.go:117] "RemoveContainer" containerID="c4872837a29e35fbc83f30950209f2aa3a78cee4dacf56f243147ad3bcd778ce" Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.768426 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.777885 4845 generic.go:334] "Generic (PLEG): container finished" podID="14928035-28fc-46c2-ade7-cc9f24cd0660" containerID="4c26cfc3687dbbd3ab44bb03c190efdd3b7dead4682e0a5673e98c0868e3d64f" exitCode=143
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.777934 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14928035-28fc-46c2-ade7-cc9f24cd0660","Type":"ContainerDied","Data":"4c26cfc3687dbbd3ab44bb03c190efdd3b7dead4682e0a5673e98c0868e3d64f"}
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.813480 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.826148 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.834615 4845 scope.go:117] "RemoveContainer" containerID="181f0dee4c9b12db1844a3f68bb3168ba9b3e0d2a0814aae2c447ff80a7d7cb5"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.856175 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 06 07:03:59 crc kubenswrapper[4845]: E1006 07:03:59.856587 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71e8ec4e-fe1b-46eb-9a91-e13178876378" containerName="nova-manage"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.856604 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="71e8ec4e-fe1b-46eb-9a91-e13178876378" containerName="nova-manage"
Oct 06 07:03:59 crc kubenswrapper[4845]: E1006 07:03:59.856623 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53664522-1edb-4c52-9931-3cf6c7239112" containerName="nova-api-log"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.856632 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="53664522-1edb-4c52-9931-3cf6c7239112" containerName="nova-api-log"
Oct 06 07:03:59 crc kubenswrapper[4845]: E1006 07:03:59.856645 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85319c4e-6277-4946-8fbd-aba39a453df8" containerName="dnsmasq-dns"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.856652 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="85319c4e-6277-4946-8fbd-aba39a453df8" containerName="dnsmasq-dns"
Oct 06 07:03:59 crc kubenswrapper[4845]: E1006 07:03:59.856681 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53664522-1edb-4c52-9931-3cf6c7239112" containerName="nova-api-api"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.856686 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="53664522-1edb-4c52-9931-3cf6c7239112" containerName="nova-api-api"
Oct 06 07:03:59 crc kubenswrapper[4845]: E1006 07:03:59.856699 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85319c4e-6277-4946-8fbd-aba39a453df8" containerName="init"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.856707 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="85319c4e-6277-4946-8fbd-aba39a453df8" containerName="init"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.856885 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="53664522-1edb-4c52-9931-3cf6c7239112" containerName="nova-api-api"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.856901 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="85319c4e-6277-4946-8fbd-aba39a453df8" containerName="dnsmasq-dns"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.856930 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="53664522-1edb-4c52-9931-3cf6c7239112" containerName="nova-api-log"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.856943 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="71e8ec4e-fe1b-46eb-9a91-e13178876378" containerName="nova-manage"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.857956 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.860738 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.860827 4845 scope.go:117] "RemoveContainer" containerID="c4872837a29e35fbc83f30950209f2aa3a78cee4dacf56f243147ad3bcd778ce"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.860928 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.861582 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 06 07:03:59 crc kubenswrapper[4845]: E1006 07:03:59.865132 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4872837a29e35fbc83f30950209f2aa3a78cee4dacf56f243147ad3bcd778ce\": container with ID starting with c4872837a29e35fbc83f30950209f2aa3a78cee4dacf56f243147ad3bcd778ce not found: ID does not exist" containerID="c4872837a29e35fbc83f30950209f2aa3a78cee4dacf56f243147ad3bcd778ce"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.865182 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4872837a29e35fbc83f30950209f2aa3a78cee4dacf56f243147ad3bcd778ce"} err="failed to get container status \"c4872837a29e35fbc83f30950209f2aa3a78cee4dacf56f243147ad3bcd778ce\": rpc error: code = NotFound desc = could not find container \"c4872837a29e35fbc83f30950209f2aa3a78cee4dacf56f243147ad3bcd778ce\": container with ID starting with c4872837a29e35fbc83f30950209f2aa3a78cee4dacf56f243147ad3bcd778ce not found: ID does not exist"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.865216 4845 scope.go:117] "RemoveContainer" containerID="181f0dee4c9b12db1844a3f68bb3168ba9b3e0d2a0814aae2c447ff80a7d7cb5"
Oct 06 07:03:59 crc kubenswrapper[4845]: E1006 07:03:59.866616 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"181f0dee4c9b12db1844a3f68bb3168ba9b3e0d2a0814aae2c447ff80a7d7cb5\": container with ID starting with 181f0dee4c9b12db1844a3f68bb3168ba9b3e0d2a0814aae2c447ff80a7d7cb5 not found: ID does not exist" containerID="181f0dee4c9b12db1844a3f68bb3168ba9b3e0d2a0814aae2c447ff80a7d7cb5"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.866671 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"181f0dee4c9b12db1844a3f68bb3168ba9b3e0d2a0814aae2c447ff80a7d7cb5"} err="failed to get container status \"181f0dee4c9b12db1844a3f68bb3168ba9b3e0d2a0814aae2c447ff80a7d7cb5\": rpc error: code = NotFound desc = could not find container \"181f0dee4c9b12db1844a3f68bb3168ba9b3e0d2a0814aae2c447ff80a7d7cb5\": container with ID starting with 181f0dee4c9b12db1844a3f68bb3168ba9b3e0d2a0814aae2c447ff80a7d7cb5 not found: ID does not exist"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.866696 4845 scope.go:117] "RemoveContainer" containerID="c4872837a29e35fbc83f30950209f2aa3a78cee4dacf56f243147ad3bcd778ce"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.867133 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4872837a29e35fbc83f30950209f2aa3a78cee4dacf56f243147ad3bcd778ce"} err="failed to get container status \"c4872837a29e35fbc83f30950209f2aa3a78cee4dacf56f243147ad3bcd778ce\": rpc error: code = NotFound desc = could not find container \"c4872837a29e35fbc83f30950209f2aa3a78cee4dacf56f243147ad3bcd778ce\": container with ID starting with c4872837a29e35fbc83f30950209f2aa3a78cee4dacf56f243147ad3bcd778ce not found: ID does not exist"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.867166 4845 scope.go:117] "RemoveContainer" containerID="181f0dee4c9b12db1844a3f68bb3168ba9b3e0d2a0814aae2c447ff80a7d7cb5"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.867529 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"181f0dee4c9b12db1844a3f68bb3168ba9b3e0d2a0814aae2c447ff80a7d7cb5"} err="failed to get container status \"181f0dee4c9b12db1844a3f68bb3168ba9b3e0d2a0814aae2c447ff80a7d7cb5\": rpc error: code = NotFound desc = could not find container \"181f0dee4c9b12db1844a3f68bb3168ba9b3e0d2a0814aae2c447ff80a7d7cb5\": container with ID starting with 181f0dee4c9b12db1844a3f68bb3168ba9b3e0d2a0814aae2c447ff80a7d7cb5 not found: ID does not exist"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.876288 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.901447 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20195d0c-d1c3-476e-86fa-2bc4d2ab39d3-public-tls-certs\") pod \"nova-api-0\" (UID: \"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3\") " pod="openstack/nova-api-0"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.901598 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20195d0c-d1c3-476e-86fa-2bc4d2ab39d3-logs\") pod \"nova-api-0\" (UID: \"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3\") " pod="openstack/nova-api-0"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.901632 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20195d0c-d1c3-476e-86fa-2bc4d2ab39d3-config-data\") pod \"nova-api-0\" (UID: \"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3\") " pod="openstack/nova-api-0"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.901745 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll2jk\" (UniqueName: \"kubernetes.io/projected/20195d0c-d1c3-476e-86fa-2bc4d2ab39d3-kube-api-access-ll2jk\") pod \"nova-api-0\" (UID: \"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3\") " pod="openstack/nova-api-0"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.901887 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20195d0c-d1c3-476e-86fa-2bc4d2ab39d3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3\") " pod="openstack/nova-api-0"
Oct 06 07:03:59 crc kubenswrapper[4845]: I1006 07:03:59.902011 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20195d0c-d1c3-476e-86fa-2bc4d2ab39d3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3\") " pod="openstack/nova-api-0"
Oct 06 07:03:59 crc kubenswrapper[4845]: E1006 07:03:59.934064 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53664522_1edb_4c52_9931_3cf6c7239112.slice\": RecentStats: unable to find data in memory cache]"
Oct 06 07:04:00 crc kubenswrapper[4845]: I1006 07:04:00.004223 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20195d0c-d1c3-476e-86fa-2bc4d2ab39d3-logs\") pod \"nova-api-0\" (UID: \"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3\") " pod="openstack/nova-api-0"
Oct 06 07:04:00 crc kubenswrapper[4845]: I1006 07:04:00.004268 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20195d0c-d1c3-476e-86fa-2bc4d2ab39d3-config-data\") pod \"nova-api-0\" (UID: \"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3\") " pod="openstack/nova-api-0"
Oct 06 07:04:00 crc kubenswrapper[4845]: I1006 07:04:00.004303 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll2jk\" (UniqueName: \"kubernetes.io/projected/20195d0c-d1c3-476e-86fa-2bc4d2ab39d3-kube-api-access-ll2jk\") pod \"nova-api-0\" (UID: \"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3\") " pod="openstack/nova-api-0"
Oct 06 07:04:00 crc kubenswrapper[4845]: I1006 07:04:00.004354 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20195d0c-d1c3-476e-86fa-2bc4d2ab39d3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3\") " pod="openstack/nova-api-0"
Oct 06 07:04:00 crc kubenswrapper[4845]: I1006 07:04:00.004396 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20195d0c-d1c3-476e-86fa-2bc4d2ab39d3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3\") " pod="openstack/nova-api-0"
Oct 06 07:04:00 crc kubenswrapper[4845]: I1006 07:04:00.004458 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20195d0c-d1c3-476e-86fa-2bc4d2ab39d3-public-tls-certs\") pod \"nova-api-0\" (UID: \"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3\") " pod="openstack/nova-api-0"
Oct 06 07:04:00 crc kubenswrapper[4845]: I1006 07:04:00.004884 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20195d0c-d1c3-476e-86fa-2bc4d2ab39d3-logs\") pod \"nova-api-0\" (UID: \"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3\") " pod="openstack/nova-api-0"
Oct 06 07:04:00 crc kubenswrapper[4845]: I1006 07:04:00.008368 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20195d0c-d1c3-476e-86fa-2bc4d2ab39d3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3\") " pod="openstack/nova-api-0"
Oct 06 07:04:00 crc kubenswrapper[4845]: I1006 07:04:00.008886 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20195d0c-d1c3-476e-86fa-2bc4d2ab39d3-config-data\") pod \"nova-api-0\" (UID: \"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3\") " pod="openstack/nova-api-0"
Oct 06 07:04:00 crc kubenswrapper[4845]: I1006 07:04:00.008911 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20195d0c-d1c3-476e-86fa-2bc4d2ab39d3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3\") " pod="openstack/nova-api-0"
Oct 06 07:04:00 crc kubenswrapper[4845]: I1006 07:04:00.008936 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20195d0c-d1c3-476e-86fa-2bc4d2ab39d3-public-tls-certs\") pod \"nova-api-0\" (UID: \"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3\") " pod="openstack/nova-api-0"
Oct 06 07:04:00 crc kubenswrapper[4845]: I1006 07:04:00.020677 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll2jk\" (UniqueName: \"kubernetes.io/projected/20195d0c-d1c3-476e-86fa-2bc4d2ab39d3-kube-api-access-ll2jk\") pod \"nova-api-0\" (UID: \"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3\") " pod="openstack/nova-api-0"
Oct 06 07:04:00 crc kubenswrapper[4845]: I1006 07:04:00.179356 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 06 07:04:00 crc kubenswrapper[4845]: I1006 07:04:00.244965 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53664522-1edb-4c52-9931-3cf6c7239112" path="/var/lib/kubelet/pods/53664522-1edb-4c52-9931-3cf6c7239112/volumes"
Oct 06 07:04:00 crc kubenswrapper[4845]: E1006 07:04:00.599520 4845 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="581b89cd7b16b4e79c10ed8b5878e1d779a532ed6d0907ba99230c5504cf2dc5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 06 07:04:00 crc kubenswrapper[4845]: E1006 07:04:00.601836 4845 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="581b89cd7b16b4e79c10ed8b5878e1d779a532ed6d0907ba99230c5504cf2dc5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 06 07:04:00 crc kubenswrapper[4845]: E1006 07:04:00.603325 4845 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="581b89cd7b16b4e79c10ed8b5878e1d779a532ed6d0907ba99230c5504cf2dc5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 06 07:04:00 crc kubenswrapper[4845]: E1006 07:04:00.603361 4845 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f3445051-6b05-4a57-8766-4f8f066510e2" containerName="nova-scheduler-scheduler"
Oct 06 07:04:00 crc kubenswrapper[4845]: I1006 07:04:00.623188 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 06 07:04:00 crc kubenswrapper[4845]: I1006 07:04:00.791167 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3","Type":"ContainerStarted","Data":"557bde4df5276d665c528f2fb982dbcc01ef4c8f77138f5d057b812c18d0f3b9"}
Oct 06 07:04:01 crc kubenswrapper[4845]: I1006 07:04:01.803551 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3","Type":"ContainerStarted","Data":"bd32702e807c81ae6e2cdff9521ec47ca4e263177937ace1db88822a29932cfc"}
Oct 06 07:04:01 crc kubenswrapper[4845]: I1006 07:04:01.803893 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20195d0c-d1c3-476e-86fa-2bc4d2ab39d3","Type":"ContainerStarted","Data":"7c6637840f3b824c5cd905423f64b2bea5923895ef394384645294b57b108788"}
Oct 06 07:04:01 crc kubenswrapper[4845]: I1006 07:04:01.840266 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.840245544 podStartE2EDuration="2.840245544s" podCreationTimestamp="2025-10-06 07:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:04:01.826239115 +0000 UTC m=+1126.340980153" watchObservedRunningTime="2025-10-06 07:04:01.840245544 +0000 UTC m=+1126.354986552"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.581590 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.672151 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/14928035-28fc-46c2-ade7-cc9f24cd0660-nova-metadata-tls-certs\") pod \"14928035-28fc-46c2-ade7-cc9f24cd0660\" (UID: \"14928035-28fc-46c2-ade7-cc9f24cd0660\") "
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.672274 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rck7h\" (UniqueName: \"kubernetes.io/projected/14928035-28fc-46c2-ade7-cc9f24cd0660-kube-api-access-rck7h\") pod \"14928035-28fc-46c2-ade7-cc9f24cd0660\" (UID: \"14928035-28fc-46c2-ade7-cc9f24cd0660\") "
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.672309 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14928035-28fc-46c2-ade7-cc9f24cd0660-config-data\") pod \"14928035-28fc-46c2-ade7-cc9f24cd0660\" (UID: \"14928035-28fc-46c2-ade7-cc9f24cd0660\") "
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.672460 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14928035-28fc-46c2-ade7-cc9f24cd0660-logs\") pod \"14928035-28fc-46c2-ade7-cc9f24cd0660\" (UID: \"14928035-28fc-46c2-ade7-cc9f24cd0660\") "
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.672486 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14928035-28fc-46c2-ade7-cc9f24cd0660-combined-ca-bundle\") pod \"14928035-28fc-46c2-ade7-cc9f24cd0660\" (UID: \"14928035-28fc-46c2-ade7-cc9f24cd0660\") "
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.673036 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14928035-28fc-46c2-ade7-cc9f24cd0660-logs" (OuterVolumeSpecName: "logs") pod "14928035-28fc-46c2-ade7-cc9f24cd0660" (UID: "14928035-28fc-46c2-ade7-cc9f24cd0660"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.686181 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14928035-28fc-46c2-ade7-cc9f24cd0660-kube-api-access-rck7h" (OuterVolumeSpecName: "kube-api-access-rck7h") pod "14928035-28fc-46c2-ade7-cc9f24cd0660" (UID: "14928035-28fc-46c2-ade7-cc9f24cd0660"). InnerVolumeSpecName "kube-api-access-rck7h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.705361 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14928035-28fc-46c2-ade7-cc9f24cd0660-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14928035-28fc-46c2-ade7-cc9f24cd0660" (UID: "14928035-28fc-46c2-ade7-cc9f24cd0660"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.707666 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14928035-28fc-46c2-ade7-cc9f24cd0660-config-data" (OuterVolumeSpecName: "config-data") pod "14928035-28fc-46c2-ade7-cc9f24cd0660" (UID: "14928035-28fc-46c2-ade7-cc9f24cd0660"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.734959 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14928035-28fc-46c2-ade7-cc9f24cd0660-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "14928035-28fc-46c2-ade7-cc9f24cd0660" (UID: "14928035-28fc-46c2-ade7-cc9f24cd0660"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.774612 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rck7h\" (UniqueName: \"kubernetes.io/projected/14928035-28fc-46c2-ade7-cc9f24cd0660-kube-api-access-rck7h\") on node \"crc\" DevicePath \"\""
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.774649 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14928035-28fc-46c2-ade7-cc9f24cd0660-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.774665 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14928035-28fc-46c2-ade7-cc9f24cd0660-logs\") on node \"crc\" DevicePath \"\""
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.774678 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14928035-28fc-46c2-ade7-cc9f24cd0660-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.774689 4845 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/14928035-28fc-46c2-ade7-cc9f24cd0660-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.819343 4845 generic.go:334] "Generic (PLEG): container finished" podID="14928035-28fc-46c2-ade7-cc9f24cd0660" containerID="8dbb13cbaf28c2d7893574825aedb1a3d0f6564ea6a9e08ae8287203c21a11e4" exitCode=0
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.819414 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14928035-28fc-46c2-ade7-cc9f24cd0660","Type":"ContainerDied","Data":"8dbb13cbaf28c2d7893574825aedb1a3d0f6564ea6a9e08ae8287203c21a11e4"}
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.819473 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14928035-28fc-46c2-ade7-cc9f24cd0660","Type":"ContainerDied","Data":"d7a7fffe9f51e0854895fc8cce331c1a94f9ebd90d5c4e16c3c9dd11e10d2667"}
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.819492 4845 scope.go:117] "RemoveContainer" containerID="8dbb13cbaf28c2d7893574825aedb1a3d0f6564ea6a9e08ae8287203c21a11e4"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.819521 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.858341 4845 scope.go:117] "RemoveContainer" containerID="4c26cfc3687dbbd3ab44bb03c190efdd3b7dead4682e0a5673e98c0868e3d64f"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.865415 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.879942 4845 scope.go:117] "RemoveContainer" containerID="8dbb13cbaf28c2d7893574825aedb1a3d0f6564ea6a9e08ae8287203c21a11e4"
Oct 06 07:04:02 crc kubenswrapper[4845]: E1006 07:04:02.880633 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dbb13cbaf28c2d7893574825aedb1a3d0f6564ea6a9e08ae8287203c21a11e4\": container with ID starting with 8dbb13cbaf28c2d7893574825aedb1a3d0f6564ea6a9e08ae8287203c21a11e4 not found: ID does not exist" containerID="8dbb13cbaf28c2d7893574825aedb1a3d0f6564ea6a9e08ae8287203c21a11e4"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.880685 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dbb13cbaf28c2d7893574825aedb1a3d0f6564ea6a9e08ae8287203c21a11e4"} err="failed to get container status \"8dbb13cbaf28c2d7893574825aedb1a3d0f6564ea6a9e08ae8287203c21a11e4\": rpc error: code = NotFound desc = could not find container \"8dbb13cbaf28c2d7893574825aedb1a3d0f6564ea6a9e08ae8287203c21a11e4\": container with ID starting with 8dbb13cbaf28c2d7893574825aedb1a3d0f6564ea6a9e08ae8287203c21a11e4 not found: ID does not exist"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.880716 4845 scope.go:117] "RemoveContainer" containerID="4c26cfc3687dbbd3ab44bb03c190efdd3b7dead4682e0a5673e98c0868e3d64f"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.881056 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 07:04:02 crc kubenswrapper[4845]: E1006 07:04:02.881237 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c26cfc3687dbbd3ab44bb03c190efdd3b7dead4682e0a5673e98c0868e3d64f\": container with ID starting with 4c26cfc3687dbbd3ab44bb03c190efdd3b7dead4682e0a5673e98c0868e3d64f not found: ID does not exist" containerID="4c26cfc3687dbbd3ab44bb03c190efdd3b7dead4682e0a5673e98c0868e3d64f"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.881264 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c26cfc3687dbbd3ab44bb03c190efdd3b7dead4682e0a5673e98c0868e3d64f"} err="failed to get container status \"4c26cfc3687dbbd3ab44bb03c190efdd3b7dead4682e0a5673e98c0868e3d64f\": rpc error: code = NotFound desc = could not find container \"4c26cfc3687dbbd3ab44bb03c190efdd3b7dead4682e0a5673e98c0868e3d64f\": container with ID starting with 4c26cfc3687dbbd3ab44bb03c190efdd3b7dead4682e0a5673e98c0868e3d64f not found: ID does not exist"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.899621 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 07:04:02 crc kubenswrapper[4845]: E1006 07:04:02.900107 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14928035-28fc-46c2-ade7-cc9f24cd0660" containerName="nova-metadata-metadata"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.900126 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="14928035-28fc-46c2-ade7-cc9f24cd0660" containerName="nova-metadata-metadata"
Oct 06 07:04:02 crc kubenswrapper[4845]: E1006 07:04:02.900141 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14928035-28fc-46c2-ade7-cc9f24cd0660" containerName="nova-metadata-log"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.900148 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="14928035-28fc-46c2-ade7-cc9f24cd0660" containerName="nova-metadata-log"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.900317 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="14928035-28fc-46c2-ade7-cc9f24cd0660" containerName="nova-metadata-metadata"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.900330 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="14928035-28fc-46c2-ade7-cc9f24cd0660" containerName="nova-metadata-log"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.901309 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.902973 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.903143 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.920318 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.980164 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fe4c060-2cea-4178-a1ec-33cf60f56ef8-logs\") pod \"nova-metadata-0\" (UID: \"1fe4c060-2cea-4178-a1ec-33cf60f56ef8\") " pod="openstack/nova-metadata-0"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.980224 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdk89\" (UniqueName: \"kubernetes.io/projected/1fe4c060-2cea-4178-a1ec-33cf60f56ef8-kube-api-access-jdk89\") pod \"nova-metadata-0\" (UID: \"1fe4c060-2cea-4178-a1ec-33cf60f56ef8\") " pod="openstack/nova-metadata-0"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.980243 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe4c060-2cea-4178-a1ec-33cf60f56ef8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1fe4c060-2cea-4178-a1ec-33cf60f56ef8\") " pod="openstack/nova-metadata-0"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.980908 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe4c060-2cea-4178-a1ec-33cf60f56ef8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1fe4c060-2cea-4178-a1ec-33cf60f56ef8\") " pod="openstack/nova-metadata-0"
Oct 06 07:04:02 crc kubenswrapper[4845]: I1006 07:04:02.980995 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe4c060-2cea-4178-a1ec-33cf60f56ef8-config-data\") pod \"nova-metadata-0\" (UID: \"1fe4c060-2cea-4178-a1ec-33cf60f56ef8\") " pod="openstack/nova-metadata-0"
Oct 06 07:04:03 crc kubenswrapper[4845]: I1006 07:04:03.082246 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe4c060-2cea-4178-a1ec-33cf60f56ef8-config-data\") pod \"nova-metadata-0\" (UID: \"1fe4c060-2cea-4178-a1ec-33cf60f56ef8\") " pod="openstack/nova-metadata-0"
Oct 06 07:04:03 crc kubenswrapper[4845]: I1006 07:04:03.082337 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fe4c060-2cea-4178-a1ec-33cf60f56ef8-logs\") pod \"nova-metadata-0\" (UID: \"1fe4c060-2cea-4178-a1ec-33cf60f56ef8\") " pod="openstack/nova-metadata-0"
Oct 06 07:04:03 crc kubenswrapper[4845]: I1006 07:04:03.082854 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fe4c060-2cea-4178-a1ec-33cf60f56ef8-logs\") pod \"nova-metadata-0\" (UID: \"1fe4c060-2cea-4178-a1ec-33cf60f56ef8\") " pod="openstack/nova-metadata-0"
Oct 06 07:04:03 crc kubenswrapper[4845]: I1006 07:04:03.082928 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdk89\" (UniqueName: \"kubernetes.io/projected/1fe4c060-2cea-4178-a1ec-33cf60f56ef8-kube-api-access-jdk89\") pod \"nova-metadata-0\" (UID: \"1fe4c060-2cea-4178-a1ec-33cf60f56ef8\") " pod="openstack/nova-metadata-0"
Oct 06 07:04:03 crc kubenswrapper[4845]: I1006 07:04:03.082960 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe4c060-2cea-4178-a1ec-33cf60f56ef8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1fe4c060-2cea-4178-a1ec-33cf60f56ef8\") " pod="openstack/nova-metadata-0"
Oct 06 07:04:03 crc kubenswrapper[4845]: I1006 07:04:03.083113 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe4c060-2cea-4178-a1ec-33cf60f56ef8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1fe4c060-2cea-4178-a1ec-33cf60f56ef8\") " pod="openstack/nova-metadata-0"
Oct 06 07:04:03 crc kubenswrapper[4845]: I1006 07:04:03.086631 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe4c060-2cea-4178-a1ec-33cf60f56ef8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1fe4c060-2cea-4178-a1ec-33cf60f56ef8\") " pod="openstack/nova-metadata-0"
Oct 06 07:04:03 crc kubenswrapper[4845]: I1006 07:04:03.087757 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe4c060-2cea-4178-a1ec-33cf60f56ef8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1fe4c060-2cea-4178-a1ec-33cf60f56ef8\") " pod="openstack/nova-metadata-0"
Oct 06 07:04:03 crc kubenswrapper[4845]: I1006 07:04:03.088836 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe4c060-2cea-4178-a1ec-33cf60f56ef8-config-data\") pod \"nova-metadata-0\" (UID: \"1fe4c060-2cea-4178-a1ec-33cf60f56ef8\") " pod="openstack/nova-metadata-0"
Oct 06 07:04:03 crc kubenswrapper[4845]: I1006 07:04:03.108343 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdk89\" (UniqueName: \"kubernetes.io/projected/1fe4c060-2cea-4178-a1ec-33cf60f56ef8-kube-api-access-jdk89\") pod \"nova-metadata-0\" (UID: \"1fe4c060-2cea-4178-a1ec-33cf60f56ef8\") " pod="openstack/nova-metadata-0"
Oct 06 07:04:03 crc kubenswrapper[4845]: I1006 07:04:03.227229 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 06 07:04:03 crc kubenswrapper[4845]: W1006 07:04:03.739677 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fe4c060_2cea_4178_a1ec_33cf60f56ef8.slice/crio-75182cf93afd0ecce229071258f567d401fa4bca2855f37f3bcac8df1e89ace6 WatchSource:0}: Error finding container 75182cf93afd0ecce229071258f567d401fa4bca2855f37f3bcac8df1e89ace6: Status 404 returned error can't find the container with id 75182cf93afd0ecce229071258f567d401fa4bca2855f37f3bcac8df1e89ace6
Oct 06 07:04:03 crc kubenswrapper[4845]: I1006 07:04:03.747907 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 07:04:03 crc kubenswrapper[4845]: I1006 07:04:03.835141 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1fe4c060-2cea-4178-a1ec-33cf60f56ef8","Type":"ContainerStarted","Data":"75182cf93afd0ecce229071258f567d401fa4bca2855f37f3bcac8df1e89ace6"}
Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.239522 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14928035-28fc-46c2-ade7-cc9f24cd0660" path="/var/lib/kubelet/pods/14928035-28fc-46c2-ade7-cc9f24cd0660/volumes"
Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.729003 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.817907 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3445051-6b05-4a57-8766-4f8f066510e2-config-data\") pod \"f3445051-6b05-4a57-8766-4f8f066510e2\" (UID: \"f3445051-6b05-4a57-8766-4f8f066510e2\") "
Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.818064 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnglz\" (UniqueName: \"kubernetes.io/projected/f3445051-6b05-4a57-8766-4f8f066510e2-kube-api-access-cnglz\") pod \"f3445051-6b05-4a57-8766-4f8f066510e2\" (UID: \"f3445051-6b05-4a57-8766-4f8f066510e2\") "
Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.818137 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3445051-6b05-4a57-8766-4f8f066510e2-combined-ca-bundle\") pod \"f3445051-6b05-4a57-8766-4f8f066510e2\" (UID: \"f3445051-6b05-4a57-8766-4f8f066510e2\") "
Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.823305 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3445051-6b05-4a57-8766-4f8f066510e2-kube-api-access-cnglz" (OuterVolumeSpecName: "kube-api-access-cnglz") pod "f3445051-6b05-4a57-8766-4f8f066510e2" (UID: "f3445051-6b05-4a57-8766-4f8f066510e2"). InnerVolumeSpecName "kube-api-access-cnglz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.846456 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3445051-6b05-4a57-8766-4f8f066510e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3445051-6b05-4a57-8766-4f8f066510e2" (UID: "f3445051-6b05-4a57-8766-4f8f066510e2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.848677 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3445051-6b05-4a57-8766-4f8f066510e2-config-data" (OuterVolumeSpecName: "config-data") pod "f3445051-6b05-4a57-8766-4f8f066510e2" (UID: "f3445051-6b05-4a57-8766-4f8f066510e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.850198 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1fe4c060-2cea-4178-a1ec-33cf60f56ef8","Type":"ContainerStarted","Data":"26abcec5512ee69652db2d8441a584d7d2e57544b430754a373170af0b57d914"} Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.850238 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1fe4c060-2cea-4178-a1ec-33cf60f56ef8","Type":"ContainerStarted","Data":"88e82b28ece4d9cf028368cf08fcea1c71ae5506cabe6a7584ec0e4770825536"} Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.854993 4845 generic.go:334] "Generic (PLEG): container finished" podID="f3445051-6b05-4a57-8766-4f8f066510e2" containerID="581b89cd7b16b4e79c10ed8b5878e1d779a532ed6d0907ba99230c5504cf2dc5" exitCode=0 Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.855154 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f3445051-6b05-4a57-8766-4f8f066510e2","Type":"ContainerDied","Data":"581b89cd7b16b4e79c10ed8b5878e1d779a532ed6d0907ba99230c5504cf2dc5"} Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.855251 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f3445051-6b05-4a57-8766-4f8f066510e2","Type":"ContainerDied","Data":"a5d20692cd55a1bff257cfb2486a188fcf8c96475ff83039d7fe53877f9abfe6"} Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 
07:04:04.855359 4845 scope.go:117] "RemoveContainer" containerID="581b89cd7b16b4e79c10ed8b5878e1d779a532ed6d0907ba99230c5504cf2dc5" Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.855613 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.869495 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.86947559 podStartE2EDuration="2.86947559s" podCreationTimestamp="2025-10-06 07:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:04:04.868841863 +0000 UTC m=+1129.383582891" watchObservedRunningTime="2025-10-06 07:04:04.86947559 +0000 UTC m=+1129.384216618" Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.899153 4845 scope.go:117] "RemoveContainer" containerID="581b89cd7b16b4e79c10ed8b5878e1d779a532ed6d0907ba99230c5504cf2dc5" Oct 06 07:04:04 crc kubenswrapper[4845]: E1006 07:04:04.899734 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"581b89cd7b16b4e79c10ed8b5878e1d779a532ed6d0907ba99230c5504cf2dc5\": container with ID starting with 581b89cd7b16b4e79c10ed8b5878e1d779a532ed6d0907ba99230c5504cf2dc5 not found: ID does not exist" containerID="581b89cd7b16b4e79c10ed8b5878e1d779a532ed6d0907ba99230c5504cf2dc5" Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.899778 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"581b89cd7b16b4e79c10ed8b5878e1d779a532ed6d0907ba99230c5504cf2dc5"} err="failed to get container status \"581b89cd7b16b4e79c10ed8b5878e1d779a532ed6d0907ba99230c5504cf2dc5\": rpc error: code = NotFound desc = could not find container \"581b89cd7b16b4e79c10ed8b5878e1d779a532ed6d0907ba99230c5504cf2dc5\": container with ID 
starting with 581b89cd7b16b4e79c10ed8b5878e1d779a532ed6d0907ba99230c5504cf2dc5 not found: ID does not exist" Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.928394 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3445051-6b05-4a57-8766-4f8f066510e2-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.928422 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnglz\" (UniqueName: \"kubernetes.io/projected/f3445051-6b05-4a57-8766-4f8f066510e2-kube-api-access-cnglz\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.928432 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3445051-6b05-4a57-8766-4f8f066510e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.933046 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.956055 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.963762 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 07:04:04 crc kubenswrapper[4845]: E1006 07:04:04.964227 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3445051-6b05-4a57-8766-4f8f066510e2" containerName="nova-scheduler-scheduler" Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.964246 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3445051-6b05-4a57-8766-4f8f066510e2" containerName="nova-scheduler-scheduler" Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.964451 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3445051-6b05-4a57-8766-4f8f066510e2" 
containerName="nova-scheduler-scheduler" Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.965545 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.967565 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 07:04:04 crc kubenswrapper[4845]: I1006 07:04:04.970938 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 07:04:05 crc kubenswrapper[4845]: I1006 07:04:05.029740 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdrbp\" (UniqueName: \"kubernetes.io/projected/b3d70a5b-fbdb-4d75-bc33-6fef87a933c6-kube-api-access-tdrbp\") pod \"nova-scheduler-0\" (UID: \"b3d70a5b-fbdb-4d75-bc33-6fef87a933c6\") " pod="openstack/nova-scheduler-0" Oct 06 07:04:05 crc kubenswrapper[4845]: I1006 07:04:05.029802 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d70a5b-fbdb-4d75-bc33-6fef87a933c6-config-data\") pod \"nova-scheduler-0\" (UID: \"b3d70a5b-fbdb-4d75-bc33-6fef87a933c6\") " pod="openstack/nova-scheduler-0" Oct 06 07:04:05 crc kubenswrapper[4845]: I1006 07:04:05.030105 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d70a5b-fbdb-4d75-bc33-6fef87a933c6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b3d70a5b-fbdb-4d75-bc33-6fef87a933c6\") " pod="openstack/nova-scheduler-0" Oct 06 07:04:05 crc kubenswrapper[4845]: I1006 07:04:05.132085 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdrbp\" (UniqueName: \"kubernetes.io/projected/b3d70a5b-fbdb-4d75-bc33-6fef87a933c6-kube-api-access-tdrbp\") pod \"nova-scheduler-0\" (UID: 
\"b3d70a5b-fbdb-4d75-bc33-6fef87a933c6\") " pod="openstack/nova-scheduler-0" Oct 06 07:04:05 crc kubenswrapper[4845]: I1006 07:04:05.132143 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d70a5b-fbdb-4d75-bc33-6fef87a933c6-config-data\") pod \"nova-scheduler-0\" (UID: \"b3d70a5b-fbdb-4d75-bc33-6fef87a933c6\") " pod="openstack/nova-scheduler-0" Oct 06 07:04:05 crc kubenswrapper[4845]: I1006 07:04:05.132216 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d70a5b-fbdb-4d75-bc33-6fef87a933c6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b3d70a5b-fbdb-4d75-bc33-6fef87a933c6\") " pod="openstack/nova-scheduler-0" Oct 06 07:04:05 crc kubenswrapper[4845]: I1006 07:04:05.136238 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3d70a5b-fbdb-4d75-bc33-6fef87a933c6-config-data\") pod \"nova-scheduler-0\" (UID: \"b3d70a5b-fbdb-4d75-bc33-6fef87a933c6\") " pod="openstack/nova-scheduler-0" Oct 06 07:04:05 crc kubenswrapper[4845]: I1006 07:04:05.136761 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d70a5b-fbdb-4d75-bc33-6fef87a933c6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b3d70a5b-fbdb-4d75-bc33-6fef87a933c6\") " pod="openstack/nova-scheduler-0" Oct 06 07:04:05 crc kubenswrapper[4845]: I1006 07:04:05.148975 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdrbp\" (UniqueName: \"kubernetes.io/projected/b3d70a5b-fbdb-4d75-bc33-6fef87a933c6-kube-api-access-tdrbp\") pod \"nova-scheduler-0\" (UID: \"b3d70a5b-fbdb-4d75-bc33-6fef87a933c6\") " pod="openstack/nova-scheduler-0" Oct 06 07:04:05 crc kubenswrapper[4845]: I1006 07:04:05.285384 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 07:04:05 crc kubenswrapper[4845]: I1006 07:04:05.689087 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 07:04:05 crc kubenswrapper[4845]: W1006 07:04:05.689600 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3d70a5b_fbdb_4d75_bc33_6fef87a933c6.slice/crio-ce5fda49e9f61dda6014615d931477439ee5a4a64cf62290d5be8c1837950ccd WatchSource:0}: Error finding container ce5fda49e9f61dda6014615d931477439ee5a4a64cf62290d5be8c1837950ccd: Status 404 returned error can't find the container with id ce5fda49e9f61dda6014615d931477439ee5a4a64cf62290d5be8c1837950ccd Oct 06 07:04:05 crc kubenswrapper[4845]: I1006 07:04:05.864863 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b3d70a5b-fbdb-4d75-bc33-6fef87a933c6","Type":"ContainerStarted","Data":"bb2ee32967d440c1ad825c90f0e92c7d849c148da1fb9e1f7a84e9910ed5701c"} Oct 06 07:04:05 crc kubenswrapper[4845]: I1006 07:04:05.865215 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b3d70a5b-fbdb-4d75-bc33-6fef87a933c6","Type":"ContainerStarted","Data":"ce5fda49e9f61dda6014615d931477439ee5a4a64cf62290d5be8c1837950ccd"} Oct 06 07:04:05 crc kubenswrapper[4845]: I1006 07:04:05.885666 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.885649919 podStartE2EDuration="1.885649919s" podCreationTimestamp="2025-10-06 07:04:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:04:05.883719458 +0000 UTC m=+1130.398460466" watchObservedRunningTime="2025-10-06 07:04:05.885649919 +0000 UTC m=+1130.400390927" Oct 06 07:04:06 crc kubenswrapper[4845]: I1006 07:04:06.247732 4845 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3445051-6b05-4a57-8766-4f8f066510e2" path="/var/lib/kubelet/pods/f3445051-6b05-4a57-8766-4f8f066510e2/volumes" Oct 06 07:04:07 crc kubenswrapper[4845]: I1006 07:04:07.572058 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="14928035-28fc-46c2-ade7-cc9f24cd0660" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 06 07:04:07 crc kubenswrapper[4845]: I1006 07:04:07.572209 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="14928035-28fc-46c2-ade7-cc9f24cd0660" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 06 07:04:08 crc kubenswrapper[4845]: I1006 07:04:08.246889 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 07:04:08 crc kubenswrapper[4845]: I1006 07:04:08.246927 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 07:04:10 crc kubenswrapper[4845]: I1006 07:04:10.179993 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 07:04:10 crc kubenswrapper[4845]: I1006 07:04:10.180325 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 07:04:10 crc kubenswrapper[4845]: I1006 07:04:10.301042 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 07:04:11 crc kubenswrapper[4845]: I1006 07:04:11.195498 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="20195d0c-d1c3-476e-86fa-2bc4d2ab39d3" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 07:04:11 crc kubenswrapper[4845]: I1006 07:04:11.195497 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="20195d0c-d1c3-476e-86fa-2bc4d2ab39d3" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 07:04:13 crc kubenswrapper[4845]: I1006 07:04:13.227954 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 07:04:13 crc kubenswrapper[4845]: I1006 07:04:13.228426 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 07:04:14 crc kubenswrapper[4845]: I1006 07:04:14.241612 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1fe4c060-2cea-4178-a1ec-33cf60f56ef8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 07:04:14 crc kubenswrapper[4845]: I1006 07:04:14.241637 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1fe4c060-2cea-4178-a1ec-33cf60f56ef8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 07:04:15 crc kubenswrapper[4845]: I1006 07:04:15.286243 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 06 07:04:15 crc kubenswrapper[4845]: I1006 07:04:15.322098 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-scheduler-0" Oct 06 07:04:15 crc kubenswrapper[4845]: I1006 07:04:15.994564 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 06 07:04:19 crc kubenswrapper[4845]: I1006 07:04:19.181239 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 06 07:04:20 crc kubenswrapper[4845]: I1006 07:04:20.190911 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 07:04:20 crc kubenswrapper[4845]: I1006 07:04:20.191766 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 07:04:20 crc kubenswrapper[4845]: I1006 07:04:20.195334 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 07:04:20 crc kubenswrapper[4845]: I1006 07:04:20.198114 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 07:04:21 crc kubenswrapper[4845]: I1006 07:04:21.017125 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 07:04:21 crc kubenswrapper[4845]: I1006 07:04:21.030749 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 07:04:23 crc kubenswrapper[4845]: I1006 07:04:23.018938 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 07:04:23 crc kubenswrapper[4845]: I1006 07:04:23.019281 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 07:04:23 crc kubenswrapper[4845]: I1006 07:04:23.019326 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" Oct 06 07:04:23 crc kubenswrapper[4845]: I1006 07:04:23.019889 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a4030ef2b48fb5db1ac392bd2dae2cd42e97737e449c0b7f5beb300ab99f64c"} pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 07:04:23 crc kubenswrapper[4845]: I1006 07:04:23.019955 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" containerID="cri-o://6a4030ef2b48fb5db1ac392bd2dae2cd42e97737e449c0b7f5beb300ab99f64c" gracePeriod=600 Oct 06 07:04:23 crc kubenswrapper[4845]: I1006 07:04:23.234737 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 07:04:23 crc kubenswrapper[4845]: I1006 07:04:23.236584 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 07:04:23 crc kubenswrapper[4845]: I1006 07:04:23.242398 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 07:04:23 crc kubenswrapper[4845]: I1006 07:04:23.244811 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 07:04:24 crc kubenswrapper[4845]: I1006 07:04:24.045443 4845 generic.go:334] "Generic (PLEG): container finished" podID="6936952c-09f0-48fd-8832-38c18202ae81" 
containerID="6a4030ef2b48fb5db1ac392bd2dae2cd42e97737e449c0b7f5beb300ab99f64c" exitCode=0 Oct 06 07:04:24 crc kubenswrapper[4845]: I1006 07:04:24.045485 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerDied","Data":"6a4030ef2b48fb5db1ac392bd2dae2cd42e97737e449c0b7f5beb300ab99f64c"} Oct 06 07:04:24 crc kubenswrapper[4845]: I1006 07:04:24.045869 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerStarted","Data":"747bfea9f095b79ec9738d4756f5ae602d44718e89fd27297d3071705491e54c"} Oct 06 07:04:24 crc kubenswrapper[4845]: I1006 07:04:24.045894 4845 scope.go:117] "RemoveContainer" containerID="f1a1b8d6a136dbd6653eb7b5058c9c79831c66bac509d23e8f8977bad3f0b842" Oct 06 07:04:30 crc kubenswrapper[4845]: I1006 07:04:30.897817 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 07:04:32 crc kubenswrapper[4845]: I1006 07:04:32.015529 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 07:04:35 crc kubenswrapper[4845]: I1006 07:04:35.539121 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="6a669571-1ec3-4cb8-8a07-e20c31ca87e5" containerName="rabbitmq" containerID="cri-o://0ee17b80a502005f4a41fe516463e1629e18c66e11e5aa7120fc2e30bdcbc00e" gracePeriod=604796 Oct 06 07:04:35 crc kubenswrapper[4845]: I1006 07:04:35.926475 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="38d9a5cf-6de3-487c-a71c-374ca55ca525" containerName="rabbitmq" containerID="cri-o://b03e7f50e171be756612b0954404b099795066c4f9a924a451b446a1704bb5e8" gracePeriod=604797 Oct 06 07:04:36 crc kubenswrapper[4845]: 
I1006 07:04:36.089177 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="6a669571-1ec3-4cb8-8a07-e20c31ca87e5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.201655 4845 generic.go:334] "Generic (PLEG): container finished" podID="6a669571-1ec3-4cb8-8a07-e20c31ca87e5" containerID="0ee17b80a502005f4a41fe516463e1629e18c66e11e5aa7120fc2e30bdcbc00e" exitCode=0 Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.201744 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6a669571-1ec3-4cb8-8a07-e20c31ca87e5","Type":"ContainerDied","Data":"0ee17b80a502005f4a41fe516463e1629e18c66e11e5aa7120fc2e30bdcbc00e"} Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.203405 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6a669571-1ec3-4cb8-8a07-e20c31ca87e5","Type":"ContainerDied","Data":"2d3d3c7fca4a33cfadbaec6a6408b923aa90ada90d3090433c20bdae3e5822d9"} Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.203478 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d3d3c7fca4a33cfadbaec6a6408b923aa90ada90d3090433c20bdae3e5822d9" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.205836 4845 generic.go:334] "Generic (PLEG): container finished" podID="38d9a5cf-6de3-487c-a71c-374ca55ca525" containerID="b03e7f50e171be756612b0954404b099795066c4f9a924a451b446a1704bb5e8" exitCode=0 Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.205878 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38d9a5cf-6de3-487c-a71c-374ca55ca525","Type":"ContainerDied","Data":"b03e7f50e171be756612b0954404b099795066c4f9a924a451b446a1704bb5e8"} Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.228708 4845 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.385807 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-plugins\") pod \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.385873 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drdm6\" (UniqueName: \"kubernetes.io/projected/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-kube-api-access-drdm6\") pod \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.385899 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-config-data\") pod \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.385972 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-confd\") pod \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.386008 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-tls\") pod \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.386064 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-erlang-cookie\") pod \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.386099 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-plugins-conf\") pod \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.386133 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-erlang-cookie-secret\") pod \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.386173 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-pod-info\") pod \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.386198 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.386238 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-server-conf\") pod \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\" (UID: \"6a669571-1ec3-4cb8-8a07-e20c31ca87e5\") " Oct 06 07:04:42 crc 
kubenswrapper[4845]: I1006 07:04:42.386388 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6a669571-1ec3-4cb8-8a07-e20c31ca87e5" (UID: "6a669571-1ec3-4cb8-8a07-e20c31ca87e5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.386954 4845 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.387100 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6a669571-1ec3-4cb8-8a07-e20c31ca87e5" (UID: "6a669571-1ec3-4cb8-8a07-e20c31ca87e5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.394667 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "6a669571-1ec3-4cb8-8a07-e20c31ca87e5" (UID: "6a669571-1ec3-4cb8-8a07-e20c31ca87e5"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.396897 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6a669571-1ec3-4cb8-8a07-e20c31ca87e5" (UID: "6a669571-1ec3-4cb8-8a07-e20c31ca87e5"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.399944 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-pod-info" (OuterVolumeSpecName: "pod-info") pod "6a669571-1ec3-4cb8-8a07-e20c31ca87e5" (UID: "6a669571-1ec3-4cb8-8a07-e20c31ca87e5"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.400100 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-kube-api-access-drdm6" (OuterVolumeSpecName: "kube-api-access-drdm6") pod "6a669571-1ec3-4cb8-8a07-e20c31ca87e5" (UID: "6a669571-1ec3-4cb8-8a07-e20c31ca87e5"). InnerVolumeSpecName "kube-api-access-drdm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.400422 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6a669571-1ec3-4cb8-8a07-e20c31ca87e5" (UID: "6a669571-1ec3-4cb8-8a07-e20c31ca87e5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.416570 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6a669571-1ec3-4cb8-8a07-e20c31ca87e5" (UID: "6a669571-1ec3-4cb8-8a07-e20c31ca87e5"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.448083 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-config-data" (OuterVolumeSpecName: "config-data") pod "6a669571-1ec3-4cb8-8a07-e20c31ca87e5" (UID: "6a669571-1ec3-4cb8-8a07-e20c31ca87e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.475826 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.488600 4845 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.488630 4845 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-pod-info\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.488658 4845 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.488668 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drdm6\" (UniqueName: \"kubernetes.io/projected/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-kube-api-access-drdm6\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.488680 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-config-data\") on node \"crc\" DevicePath 
\"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.488688 4845 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.488696 4845 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.488705 4845 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.492167 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-server-conf" (OuterVolumeSpecName: "server-conf") pod "6a669571-1ec3-4cb8-8a07-e20c31ca87e5" (UID: "6a669571-1ec3-4cb8-8a07-e20c31ca87e5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.526167 4845 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.561284 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6a669571-1ec3-4cb8-8a07-e20c31ca87e5" (UID: "6a669571-1ec3-4cb8-8a07-e20c31ca87e5"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.589838 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38d9a5cf-6de3-487c-a71c-374ca55ca525-pod-info\") pod \"38d9a5cf-6de3-487c-a71c-374ca55ca525\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.589897 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38d9a5cf-6de3-487c-a71c-374ca55ca525-erlang-cookie-secret\") pod \"38d9a5cf-6de3-487c-a71c-374ca55ca525\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.590034 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38d9a5cf-6de3-487c-a71c-374ca55ca525-server-conf\") pod \"38d9a5cf-6de3-487c-a71c-374ca55ca525\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.590059 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"38d9a5cf-6de3-487c-a71c-374ca55ca525\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.590082 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38d9a5cf-6de3-487c-a71c-374ca55ca525-plugins-conf\") pod \"38d9a5cf-6de3-487c-a71c-374ca55ca525\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.590109 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-plugins\") pod \"38d9a5cf-6de3-487c-a71c-374ca55ca525\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.590158 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-tls\") pod \"38d9a5cf-6de3-487c-a71c-374ca55ca525\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.590183 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-erlang-cookie\") pod \"38d9a5cf-6de3-487c-a71c-374ca55ca525\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.590284 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbp69\" (UniqueName: \"kubernetes.io/projected/38d9a5cf-6de3-487c-a71c-374ca55ca525-kube-api-access-mbp69\") pod \"38d9a5cf-6de3-487c-a71c-374ca55ca525\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.590337 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-confd\") pod \"38d9a5cf-6de3-487c-a71c-374ca55ca525\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.590360 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38d9a5cf-6de3-487c-a71c-374ca55ca525-config-data\") pod \"38d9a5cf-6de3-487c-a71c-374ca55ca525\" (UID: \"38d9a5cf-6de3-487c-a71c-374ca55ca525\") " Oct 06 07:04:42 crc 
kubenswrapper[4845]: I1006 07:04:42.591550 4845 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.591573 4845 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-server-conf\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.591594 4845 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a669571-1ec3-4cb8-8a07-e20c31ca87e5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.592204 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "38d9a5cf-6de3-487c-a71c-374ca55ca525" (UID: "38d9a5cf-6de3-487c-a71c-374ca55ca525"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.595616 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "38d9a5cf-6de3-487c-a71c-374ca55ca525" (UID: "38d9a5cf-6de3-487c-a71c-374ca55ca525"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.595908 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38d9a5cf-6de3-487c-a71c-374ca55ca525-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "38d9a5cf-6de3-487c-a71c-374ca55ca525" (UID: "38d9a5cf-6de3-487c-a71c-374ca55ca525"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.596155 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "38d9a5cf-6de3-487c-a71c-374ca55ca525" (UID: "38d9a5cf-6de3-487c-a71c-374ca55ca525"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.598840 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/38d9a5cf-6de3-487c-a71c-374ca55ca525-pod-info" (OuterVolumeSpecName: "pod-info") pod "38d9a5cf-6de3-487c-a71c-374ca55ca525" (UID: "38d9a5cf-6de3-487c-a71c-374ca55ca525"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.599566 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "38d9a5cf-6de3-487c-a71c-374ca55ca525" (UID: "38d9a5cf-6de3-487c-a71c-374ca55ca525"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.602820 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38d9a5cf-6de3-487c-a71c-374ca55ca525-kube-api-access-mbp69" (OuterVolumeSpecName: "kube-api-access-mbp69") pod "38d9a5cf-6de3-487c-a71c-374ca55ca525" (UID: "38d9a5cf-6de3-487c-a71c-374ca55ca525"). InnerVolumeSpecName "kube-api-access-mbp69". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.605011 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d9a5cf-6de3-487c-a71c-374ca55ca525-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "38d9a5cf-6de3-487c-a71c-374ca55ca525" (UID: "38d9a5cf-6de3-487c-a71c-374ca55ca525"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.615258 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38d9a5cf-6de3-487c-a71c-374ca55ca525-config-data" (OuterVolumeSpecName: "config-data") pod "38d9a5cf-6de3-487c-a71c-374ca55ca525" (UID: "38d9a5cf-6de3-487c-a71c-374ca55ca525"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.651587 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38d9a5cf-6de3-487c-a71c-374ca55ca525-server-conf" (OuterVolumeSpecName: "server-conf") pod "38d9a5cf-6de3-487c-a71c-374ca55ca525" (UID: "38d9a5cf-6de3-487c-a71c-374ca55ca525"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.693639 4845 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38d9a5cf-6de3-487c-a71c-374ca55ca525-server-conf\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.693695 4845 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.693705 4845 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38d9a5cf-6de3-487c-a71c-374ca55ca525-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.693714 4845 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.693722 4845 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.693731 4845 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.693739 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbp69\" (UniqueName: \"kubernetes.io/projected/38d9a5cf-6de3-487c-a71c-374ca55ca525-kube-api-access-mbp69\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.693747 
4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38d9a5cf-6de3-487c-a71c-374ca55ca525-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.693754 4845 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38d9a5cf-6de3-487c-a71c-374ca55ca525-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.693762 4845 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38d9a5cf-6de3-487c-a71c-374ca55ca525-pod-info\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.701166 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "38d9a5cf-6de3-487c-a71c-374ca55ca525" (UID: "38d9a5cf-6de3-487c-a71c-374ca55ca525"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.719358 4845 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.795450 4845 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:42 crc kubenswrapper[4845]: I1006 07:04:42.795479 4845 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38d9a5cf-6de3-487c-a71c-374ca55ca525-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.216965 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.216996 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.216994 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38d9a5cf-6de3-487c-a71c-374ca55ca525","Type":"ContainerDied","Data":"a93c317988726ae1e9e51b745f4d50bfe2e532e35a9443aada242cd1e959b99c"} Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.217587 4845 scope.go:117] "RemoveContainer" containerID="b03e7f50e171be756612b0954404b099795066c4f9a924a451b446a1704bb5e8" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.258659 4845 scope.go:117] "RemoveContainer" containerID="7822a084cc8e1391a4f88232f9ad984b3d121bb9054ea9593f43c5d0942dc39c" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.271147 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.288824 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.311465 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.328247 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.340818 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 07:04:43 crc kubenswrapper[4845]: E1006 07:04:43.341289 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a669571-1ec3-4cb8-8a07-e20c31ca87e5" containerName="setup-container" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.341313 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a669571-1ec3-4cb8-8a07-e20c31ca87e5" containerName="setup-container" Oct 06 07:04:43 crc kubenswrapper[4845]: E1006 07:04:43.341339 4845 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="38d9a5cf-6de3-487c-a71c-374ca55ca525" containerName="setup-container" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.341347 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d9a5cf-6de3-487c-a71c-374ca55ca525" containerName="setup-container" Oct 06 07:04:43 crc kubenswrapper[4845]: E1006 07:04:43.341365 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a669571-1ec3-4cb8-8a07-e20c31ca87e5" containerName="rabbitmq" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.341392 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a669571-1ec3-4cb8-8a07-e20c31ca87e5" containerName="rabbitmq" Oct 06 07:04:43 crc kubenswrapper[4845]: E1006 07:04:43.341408 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d9a5cf-6de3-487c-a71c-374ca55ca525" containerName="rabbitmq" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.341416 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d9a5cf-6de3-487c-a71c-374ca55ca525" containerName="rabbitmq" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.341633 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a669571-1ec3-4cb8-8a07-e20c31ca87e5" containerName="rabbitmq" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.341666 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="38d9a5cf-6de3-487c-a71c-374ca55ca525" containerName="rabbitmq" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.342764 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.345126 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.345563 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.345726 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.345878 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.346007 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-g5scq" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.346028 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.346066 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.356308 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.370675 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.373239 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.379818 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-x6g58" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.379986 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.380191 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.381079 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.381136 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.381193 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.381242 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.382021 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.507831 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c295b190-30fe-47c3-ae27-c6b809bbe058-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.507874 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c295b190-30fe-47c3-ae27-c6b809bbe058-config-data\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.507895 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/28832f5d-962f-4eef-8903-ab5061b80102-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.507913 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c295b190-30fe-47c3-ae27-c6b809bbe058-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.507934 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/28832f5d-962f-4eef-8903-ab5061b80102-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.508047 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c295b190-30fe-47c3-ae27-c6b809bbe058-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.508072 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/c295b190-30fe-47c3-ae27-c6b809bbe058-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.508093 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/28832f5d-962f-4eef-8903-ab5061b80102-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.508116 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bngd\" (UniqueName: \"kubernetes.io/projected/28832f5d-962f-4eef-8903-ab5061b80102-kube-api-access-2bngd\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.508141 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/28832f5d-962f-4eef-8903-ab5061b80102-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.508167 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/28832f5d-962f-4eef-8903-ab5061b80102-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.508231 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.508265 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkvzg\" (UniqueName: \"kubernetes.io/projected/c295b190-30fe-47c3-ae27-c6b809bbe058-kube-api-access-dkvzg\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.508284 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/28832f5d-962f-4eef-8903-ab5061b80102-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.508497 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c295b190-30fe-47c3-ae27-c6b809bbe058-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.508570 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.508603 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/28832f5d-962f-4eef-8903-ab5061b80102-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.508625 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c295b190-30fe-47c3-ae27-c6b809bbe058-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.508664 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/28832f5d-962f-4eef-8903-ab5061b80102-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.508711 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c295b190-30fe-47c3-ae27-c6b809bbe058-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.508732 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/28832f5d-962f-4eef-8903-ab5061b80102-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.508750 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/c295b190-30fe-47c3-ae27-c6b809bbe058-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.611529 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/28832f5d-962f-4eef-8903-ab5061b80102-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.611583 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bngd\" (UniqueName: \"kubernetes.io/projected/28832f5d-962f-4eef-8903-ab5061b80102-kube-api-access-2bngd\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.611607 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/28832f5d-962f-4eef-8903-ab5061b80102-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.611625 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/28832f5d-962f-4eef-8903-ab5061b80102-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.611661 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.611712 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkvzg\" (UniqueName: \"kubernetes.io/projected/c295b190-30fe-47c3-ae27-c6b809bbe058-kube-api-access-dkvzg\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.611741 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/28832f5d-962f-4eef-8903-ab5061b80102-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.611816 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c295b190-30fe-47c3-ae27-c6b809bbe058-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.611846 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.611864 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/28832f5d-962f-4eef-8903-ab5061b80102-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 
07:04:43.611881 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c295b190-30fe-47c3-ae27-c6b809bbe058-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.611908 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/28832f5d-962f-4eef-8903-ab5061b80102-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.611927 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/28832f5d-962f-4eef-8903-ab5061b80102-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.611943 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c295b190-30fe-47c3-ae27-c6b809bbe058-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.611964 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c295b190-30fe-47c3-ae27-c6b809bbe058-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.611994 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/c295b190-30fe-47c3-ae27-c6b809bbe058-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.612012 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c295b190-30fe-47c3-ae27-c6b809bbe058-config-data\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.612032 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/28832f5d-962f-4eef-8903-ab5061b80102-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.612048 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c295b190-30fe-47c3-ae27-c6b809bbe058-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.612066 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/28832f5d-962f-4eef-8903-ab5061b80102-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.612089 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c295b190-30fe-47c3-ae27-c6b809bbe058-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") 
" pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.612105 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c295b190-30fe-47c3-ae27-c6b809bbe058-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.612588 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c295b190-30fe-47c3-ae27-c6b809bbe058-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.613817 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c295b190-30fe-47c3-ae27-c6b809bbe058-config-data\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.613927 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.613990 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c295b190-30fe-47c3-ae27-c6b809bbe058-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.614008 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/28832f5d-962f-4eef-8903-ab5061b80102-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.614025 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.614004 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/28832f5d-962f-4eef-8903-ab5061b80102-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.614294 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c295b190-30fe-47c3-ae27-c6b809bbe058-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.614357 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/28832f5d-962f-4eef-8903-ab5061b80102-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.614362 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c295b190-30fe-47c3-ae27-c6b809bbe058-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.614705 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/28832f5d-962f-4eef-8903-ab5061b80102-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.615023 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/28832f5d-962f-4eef-8903-ab5061b80102-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.618172 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c295b190-30fe-47c3-ae27-c6b809bbe058-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.620139 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c295b190-30fe-47c3-ae27-c6b809bbe058-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.620336 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/28832f5d-962f-4eef-8903-ab5061b80102-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.621300 
4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/28832f5d-962f-4eef-8903-ab5061b80102-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.622490 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/28832f5d-962f-4eef-8903-ab5061b80102-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.622889 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c295b190-30fe-47c3-ae27-c6b809bbe058-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.623174 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c295b190-30fe-47c3-ae27-c6b809bbe058-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.624989 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/28832f5d-962f-4eef-8903-ab5061b80102-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.637712 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bngd\" (UniqueName: 
\"kubernetes.io/projected/28832f5d-962f-4eef-8903-ab5061b80102-kube-api-access-2bngd\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.645759 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkvzg\" (UniqueName: \"kubernetes.io/projected/c295b190-30fe-47c3-ae27-c6b809bbe058-kube-api-access-dkvzg\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.653669 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"28832f5d-962f-4eef-8903-ab5061b80102\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.669136 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.669911 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"c295b190-30fe-47c3-ae27-c6b809bbe058\") " pod="openstack/rabbitmq-server-0" Oct 06 07:04:43 crc kubenswrapper[4845]: I1006 07:04:43.694257 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 07:04:44 crc kubenswrapper[4845]: I1006 07:04:44.217901 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 07:04:44 crc kubenswrapper[4845]: I1006 07:04:44.240938 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38d9a5cf-6de3-487c-a71c-374ca55ca525" path="/var/lib/kubelet/pods/38d9a5cf-6de3-487c-a71c-374ca55ca525/volumes" Oct 06 07:04:44 crc kubenswrapper[4845]: I1006 07:04:44.241952 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a669571-1ec3-4cb8-8a07-e20c31ca87e5" path="/var/lib/kubelet/pods/6a669571-1ec3-4cb8-8a07-e20c31ca87e5/volumes" Oct 06 07:04:44 crc kubenswrapper[4845]: I1006 07:04:44.243506 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"28832f5d-962f-4eef-8903-ab5061b80102","Type":"ContainerStarted","Data":"fe77f7b5c6a553280f8632340a3e2830840db6f8a30fd5c906308a2d52eaf100"} Oct 06 07:04:44 crc kubenswrapper[4845]: I1006 07:04:44.341079 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.246730 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c295b190-30fe-47c3-ae27-c6b809bbe058","Type":"ContainerStarted","Data":"fabd67fbd898018e343d86d0eb4716667d0d3a086d4f63bca1da82b1a07fa422"} Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.310618 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74d6f9b95f-hmrq5"] Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.314967 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.318049 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.327615 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74d6f9b95f-hmrq5"] Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.450676 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m47gh\" (UniqueName: \"kubernetes.io/projected/04cb8528-d9be-44be-9cf2-70fc0e239a33-kube-api-access-m47gh\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.450757 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-dns-svc\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.451018 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-ovsdbserver-sb\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.451127 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-ovsdbserver-nb\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " 
pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.451172 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-config\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.451389 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-openstack-edpm-ipam\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.451444 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-dns-swift-storage-0\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.553056 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-openstack-edpm-ipam\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.553116 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-dns-swift-storage-0\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: 
\"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.553148 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m47gh\" (UniqueName: \"kubernetes.io/projected/04cb8528-d9be-44be-9cf2-70fc0e239a33-kube-api-access-m47gh\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.553228 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-dns-svc\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.553337 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-ovsdbserver-sb\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.553407 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-ovsdbserver-nb\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.553435 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-config\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " 
pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.554164 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-openstack-edpm-ipam\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.554175 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-dns-swift-storage-0\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.554411 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-ovsdbserver-sb\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.554411 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-config\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.555033 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-ovsdbserver-nb\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" Oct 06 07:04:45 crc kubenswrapper[4845]: 
I1006 07:04:45.555050 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-dns-svc\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5"
Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.575450 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m47gh\" (UniqueName: \"kubernetes.io/projected/04cb8528-d9be-44be-9cf2-70fc0e239a33-kube-api-access-m47gh\") pod \"dnsmasq-dns-74d6f9b95f-hmrq5\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") " pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5"
Oct 06 07:04:45 crc kubenswrapper[4845]: I1006 07:04:45.646226 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5"
Oct 06 07:04:46 crc kubenswrapper[4845]: W1006 07:04:46.101079 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04cb8528_d9be_44be_9cf2_70fc0e239a33.slice/crio-8615e2e961fab1afb60c4a84431ee76ed62c4cd6c8ffdb3ad05ba6626fa46044 WatchSource:0}: Error finding container 8615e2e961fab1afb60c4a84431ee76ed62c4cd6c8ffdb3ad05ba6626fa46044: Status 404 returned error can't find the container with id 8615e2e961fab1afb60c4a84431ee76ed62c4cd6c8ffdb3ad05ba6626fa46044
Oct 06 07:04:46 crc kubenswrapper[4845]: I1006 07:04:46.109948 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74d6f9b95f-hmrq5"]
Oct 06 07:04:46 crc kubenswrapper[4845]: I1006 07:04:46.278446 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"28832f5d-962f-4eef-8903-ab5061b80102","Type":"ContainerStarted","Data":"78abf6f4c708a576292b2a592d45105e2650f5288d7e58e13f9a14b0eea5a9e8"}
Oct 06 07:04:46 crc kubenswrapper[4845]: I1006 07:04:46.284275 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" event={"ID":"04cb8528-d9be-44be-9cf2-70fc0e239a33","Type":"ContainerStarted","Data":"8615e2e961fab1afb60c4a84431ee76ed62c4cd6c8ffdb3ad05ba6626fa46044"}
Oct 06 07:04:46 crc kubenswrapper[4845]: I1006 07:04:46.286649 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c295b190-30fe-47c3-ae27-c6b809bbe058","Type":"ContainerStarted","Data":"f86378516c9c77c760d0afb916d51fd0355fa25f429b0b5b09229cb16a091b9e"}
Oct 06 07:04:47 crc kubenswrapper[4845]: I1006 07:04:47.296484 4845 generic.go:334] "Generic (PLEG): container finished" podID="04cb8528-d9be-44be-9cf2-70fc0e239a33" containerID="cef220cfaac095514d0c11524a376c14818ee9d149cca9668fa904acc747a820" exitCode=0
Oct 06 07:04:47 crc kubenswrapper[4845]: I1006 07:04:47.296592 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" event={"ID":"04cb8528-d9be-44be-9cf2-70fc0e239a33","Type":"ContainerDied","Data":"cef220cfaac095514d0c11524a376c14818ee9d149cca9668fa904acc747a820"}
Oct 06 07:04:48 crc kubenswrapper[4845]: I1006 07:04:48.309262 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" event={"ID":"04cb8528-d9be-44be-9cf2-70fc0e239a33","Type":"ContainerStarted","Data":"4c8515bc29649291a8afb21ba76d898dcf18499227f9fe49bb8954b68469a34b"}
Oct 06 07:04:48 crc kubenswrapper[4845]: I1006 07:04:48.309518 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5"
Oct 06 07:04:48 crc kubenswrapper[4845]: I1006 07:04:48.336089 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" podStartSLOduration=3.336072023 podStartE2EDuration="3.336072023s" podCreationTimestamp="2025-10-06 07:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:04:48.328160461 +0000 UTC m=+1172.842901479" watchObservedRunningTime="2025-10-06 07:04:48.336072023 +0000 UTC m=+1172.850813031"
Oct 06 07:04:55 crc kubenswrapper[4845]: I1006 07:04:55.648113 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5"
Oct 06 07:04:55 crc kubenswrapper[4845]: I1006 07:04:55.716395 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fff6d6bd5-92lvg"]
Oct 06 07:04:55 crc kubenswrapper[4845]: I1006 07:04:55.719696 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" podUID="b1c731f5-0d76-4221-9d87-67d746bbc8e6" containerName="dnsmasq-dns" containerID="cri-o://b99dfb34d23ee4ad9d1c9c2b078c301abfc5afd758206014ab557f6cbc86d7bd" gracePeriod=10
Oct 06 07:04:55 crc kubenswrapper[4845]: I1006 07:04:55.867644 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55565d6cbc-hlnmz"]
Oct 06 07:04:55 crc kubenswrapper[4845]: I1006 07:04:55.869194 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:55 crc kubenswrapper[4845]: I1006 07:04:55.935136 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55565d6cbc-hlnmz"]
Oct 06 07:04:55 crc kubenswrapper[4845]: I1006 07:04:55.945701 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f37a2a86-a24c-4fa6-9944-ba02bf209e32-ovsdbserver-nb\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:55 crc kubenswrapper[4845]: I1006 07:04:55.945785 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8p4k\" (UniqueName: \"kubernetes.io/projected/f37a2a86-a24c-4fa6-9944-ba02bf209e32-kube-api-access-w8p4k\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:55 crc kubenswrapper[4845]: I1006 07:04:55.945819 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f37a2a86-a24c-4fa6-9944-ba02bf209e32-ovsdbserver-sb\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:55 crc kubenswrapper[4845]: I1006 07:04:55.945872 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f37a2a86-a24c-4fa6-9944-ba02bf209e32-config\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:55 crc kubenswrapper[4845]: I1006 07:04:55.945913 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f37a2a86-a24c-4fa6-9944-ba02bf209e32-dns-svc\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:55 crc kubenswrapper[4845]: I1006 07:04:55.945966 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f37a2a86-a24c-4fa6-9944-ba02bf209e32-dns-swift-storage-0\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:55 crc kubenswrapper[4845]: I1006 07:04:55.946054 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f37a2a86-a24c-4fa6-9944-ba02bf209e32-openstack-edpm-ipam\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.051365 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f37a2a86-a24c-4fa6-9944-ba02bf209e32-ovsdbserver-nb\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.051439 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8p4k\" (UniqueName: \"kubernetes.io/projected/f37a2a86-a24c-4fa6-9944-ba02bf209e32-kube-api-access-w8p4k\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.051461 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f37a2a86-a24c-4fa6-9944-ba02bf209e32-ovsdbserver-sb\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.051495 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f37a2a86-a24c-4fa6-9944-ba02bf209e32-config\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.051516 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f37a2a86-a24c-4fa6-9944-ba02bf209e32-dns-svc\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.051550 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f37a2a86-a24c-4fa6-9944-ba02bf209e32-dns-swift-storage-0\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.051606 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f37a2a86-a24c-4fa6-9944-ba02bf209e32-openstack-edpm-ipam\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.052683 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f37a2a86-a24c-4fa6-9944-ba02bf209e32-openstack-edpm-ipam\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.058126 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f37a2a86-a24c-4fa6-9944-ba02bf209e32-ovsdbserver-sb\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.060358 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f37a2a86-a24c-4fa6-9944-ba02bf209e32-config\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.060516 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f37a2a86-a24c-4fa6-9944-ba02bf209e32-dns-svc\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.065119 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f37a2a86-a24c-4fa6-9944-ba02bf209e32-ovsdbserver-nb\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.065553 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f37a2a86-a24c-4fa6-9944-ba02bf209e32-dns-swift-storage-0\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.087169 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8p4k\" (UniqueName: \"kubernetes.io/projected/f37a2a86-a24c-4fa6-9944-ba02bf209e32-kube-api-access-w8p4k\") pod \"dnsmasq-dns-55565d6cbc-hlnmz\" (UID: \"f37a2a86-a24c-4fa6-9944-ba02bf209e32\") " pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.252046 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.405237 4845 generic.go:334] "Generic (PLEG): container finished" podID="b1c731f5-0d76-4221-9d87-67d746bbc8e6" containerID="b99dfb34d23ee4ad9d1c9c2b078c301abfc5afd758206014ab557f6cbc86d7bd" exitCode=0
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.405385 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" event={"ID":"b1c731f5-0d76-4221-9d87-67d746bbc8e6","Type":"ContainerDied","Data":"b99dfb34d23ee4ad9d1c9c2b078c301abfc5afd758206014ab557f6cbc86d7bd"}
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.441659 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg"
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.565036 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55565d6cbc-hlnmz"]
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.566686 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgdrq\" (UniqueName: \"kubernetes.io/projected/b1c731f5-0d76-4221-9d87-67d746bbc8e6-kube-api-access-dgdrq\") pod \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") "
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.566738 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-dns-svc\") pod \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") "
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.566829 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-ovsdbserver-nb\") pod \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") "
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.566886 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-ovsdbserver-sb\") pod \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") "
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.566937 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-config\") pod \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") "
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.567068 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-dns-swift-storage-0\") pod \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\" (UID: \"b1c731f5-0d76-4221-9d87-67d746bbc8e6\") "
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.572966 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1c731f5-0d76-4221-9d87-67d746bbc8e6-kube-api-access-dgdrq" (OuterVolumeSpecName: "kube-api-access-dgdrq") pod "b1c731f5-0d76-4221-9d87-67d746bbc8e6" (UID: "b1c731f5-0d76-4221-9d87-67d746bbc8e6"). InnerVolumeSpecName "kube-api-access-dgdrq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:04:56 crc kubenswrapper[4845]: W1006 07:04:56.574040 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf37a2a86_a24c_4fa6_9944_ba02bf209e32.slice/crio-a47319b7effd00835652f4128ef0091d1b3d3a39fcb712ae8bdc883e6c78117c WatchSource:0}: Error finding container a47319b7effd00835652f4128ef0091d1b3d3a39fcb712ae8bdc883e6c78117c: Status 404 returned error can't find the container with id a47319b7effd00835652f4128ef0091d1b3d3a39fcb712ae8bdc883e6c78117c
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.620102 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-config" (OuterVolumeSpecName: "config") pod "b1c731f5-0d76-4221-9d87-67d746bbc8e6" (UID: "b1c731f5-0d76-4221-9d87-67d746bbc8e6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.625221 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b1c731f5-0d76-4221-9d87-67d746bbc8e6" (UID: "b1c731f5-0d76-4221-9d87-67d746bbc8e6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.627762 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b1c731f5-0d76-4221-9d87-67d746bbc8e6" (UID: "b1c731f5-0d76-4221-9d87-67d746bbc8e6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.634718 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b1c731f5-0d76-4221-9d87-67d746bbc8e6" (UID: "b1c731f5-0d76-4221-9d87-67d746bbc8e6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.654491 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b1c731f5-0d76-4221-9d87-67d746bbc8e6" (UID: "b1c731f5-0d76-4221-9d87-67d746bbc8e6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.668770 4845 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.668801 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgdrq\" (UniqueName: \"kubernetes.io/projected/b1c731f5-0d76-4221-9d87-67d746bbc8e6-kube-api-access-dgdrq\") on node \"crc\" DevicePath \"\""
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.668816 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.668826 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.668835 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 06 07:04:56 crc kubenswrapper[4845]: I1006 07:04:56.668845 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1c731f5-0d76-4221-9d87-67d746bbc8e6-config\") on node \"crc\" DevicePath \"\""
Oct 06 07:04:57 crc kubenswrapper[4845]: I1006 07:04:57.414914 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg" event={"ID":"b1c731f5-0d76-4221-9d87-67d746bbc8e6","Type":"ContainerDied","Data":"1ec94e9c5ab2662a86d942d8a59712ee39cc5a28e351bfc2e0012691b16651f6"}
Oct 06 07:04:57 crc kubenswrapper[4845]: I1006 07:04:57.415627 4845 scope.go:117] "RemoveContainer" containerID="b99dfb34d23ee4ad9d1c9c2b078c301abfc5afd758206014ab557f6cbc86d7bd"
Oct 06 07:04:57 crc kubenswrapper[4845]: I1006 07:04:57.414928 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fff6d6bd5-92lvg"
Oct 06 07:04:57 crc kubenswrapper[4845]: I1006 07:04:57.416780 4845 generic.go:334] "Generic (PLEG): container finished" podID="f37a2a86-a24c-4fa6-9944-ba02bf209e32" containerID="6a6aa5e7b808c9a1b808456995cadd4067b072ba96276dff27ad19967ac73526" exitCode=0
Oct 06 07:04:57 crc kubenswrapper[4845]: I1006 07:04:57.416822 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz" event={"ID":"f37a2a86-a24c-4fa6-9944-ba02bf209e32","Type":"ContainerDied","Data":"6a6aa5e7b808c9a1b808456995cadd4067b072ba96276dff27ad19967ac73526"}
Oct 06 07:04:57 crc kubenswrapper[4845]: I1006 07:04:57.416847 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz" event={"ID":"f37a2a86-a24c-4fa6-9944-ba02bf209e32","Type":"ContainerStarted","Data":"a47319b7effd00835652f4128ef0091d1b3d3a39fcb712ae8bdc883e6c78117c"}
Oct 06 07:04:57 crc kubenswrapper[4845]: I1006 07:04:57.441765 4845 scope.go:117] "RemoveContainer" containerID="82eb10f2cbbd53166bc7f6c9a3884fa981c94ccd8c3bec1411504d173de1f48c"
Oct 06 07:04:57 crc kubenswrapper[4845]: I1006 07:04:57.643919 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fff6d6bd5-92lvg"]
Oct 06 07:04:57 crc kubenswrapper[4845]: I1006 07:04:57.656342 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fff6d6bd5-92lvg"]
Oct 06 07:04:58 crc kubenswrapper[4845]: I1006 07:04:58.238369 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1c731f5-0d76-4221-9d87-67d746bbc8e6" path="/var/lib/kubelet/pods/b1c731f5-0d76-4221-9d87-67d746bbc8e6/volumes"
Oct 06 07:04:58 crc kubenswrapper[4845]: I1006 07:04:58.427825 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz" event={"ID":"f37a2a86-a24c-4fa6-9944-ba02bf209e32","Type":"ContainerStarted","Data":"db0ed64a2e3bf2bd607a7e8176e56117020b50e5de772cb838b49b0cf73b85cb"}
Oct 06 07:04:58 crc kubenswrapper[4845]: I1006 07:04:58.427980 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:04:58 crc kubenswrapper[4845]: I1006 07:04:58.458350 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz" podStartSLOduration=3.458322723 podStartE2EDuration="3.458322723s" podCreationTimestamp="2025-10-06 07:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:04:58.457523572 +0000 UTC m=+1182.972264590" watchObservedRunningTime="2025-10-06 07:04:58.458322723 +0000 UTC m=+1182.973063761"
Oct 06 07:05:06 crc kubenswrapper[4845]: I1006 07:05:06.255592 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55565d6cbc-hlnmz"
Oct 06 07:05:06 crc kubenswrapper[4845]: I1006 07:05:06.375190 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74d6f9b95f-hmrq5"]
Oct 06 07:05:06 crc kubenswrapper[4845]: I1006 07:05:06.376886 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" podUID="04cb8528-d9be-44be-9cf2-70fc0e239a33" containerName="dnsmasq-dns" containerID="cri-o://4c8515bc29649291a8afb21ba76d898dcf18499227f9fe49bb8954b68469a34b" gracePeriod=10
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.059107 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5"
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.091013 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-dns-swift-storage-0\") pod \"04cb8528-d9be-44be-9cf2-70fc0e239a33\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") "
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.091082 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-dns-svc\") pod \"04cb8528-d9be-44be-9cf2-70fc0e239a33\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") "
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.091114 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-ovsdbserver-sb\") pod \"04cb8528-d9be-44be-9cf2-70fc0e239a33\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") "
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.091229 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-config\") pod \"04cb8528-d9be-44be-9cf2-70fc0e239a33\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") "
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.091279 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m47gh\" (UniqueName: \"kubernetes.io/projected/04cb8528-d9be-44be-9cf2-70fc0e239a33-kube-api-access-m47gh\") pod \"04cb8528-d9be-44be-9cf2-70fc0e239a33\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") "
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.091309 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-openstack-edpm-ipam\") pod \"04cb8528-d9be-44be-9cf2-70fc0e239a33\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") "
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.091403 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-ovsdbserver-nb\") pod \"04cb8528-d9be-44be-9cf2-70fc0e239a33\" (UID: \"04cb8528-d9be-44be-9cf2-70fc0e239a33\") "
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.100664 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04cb8528-d9be-44be-9cf2-70fc0e239a33-kube-api-access-m47gh" (OuterVolumeSpecName: "kube-api-access-m47gh") pod "04cb8528-d9be-44be-9cf2-70fc0e239a33" (UID: "04cb8528-d9be-44be-9cf2-70fc0e239a33"). InnerVolumeSpecName "kube-api-access-m47gh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.164071 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-config" (OuterVolumeSpecName: "config") pod "04cb8528-d9be-44be-9cf2-70fc0e239a33" (UID: "04cb8528-d9be-44be-9cf2-70fc0e239a33"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.166090 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "04cb8528-d9be-44be-9cf2-70fc0e239a33" (UID: "04cb8528-d9be-44be-9cf2-70fc0e239a33"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.167585 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04cb8528-d9be-44be-9cf2-70fc0e239a33" (UID: "04cb8528-d9be-44be-9cf2-70fc0e239a33"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.180629 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04cb8528-d9be-44be-9cf2-70fc0e239a33" (UID: "04cb8528-d9be-44be-9cf2-70fc0e239a33"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.186199 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "04cb8528-d9be-44be-9cf2-70fc0e239a33" (UID: "04cb8528-d9be-44be-9cf2-70fc0e239a33"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.193447 4845 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.193677 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.193688 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-config\") on node \"crc\" DevicePath \"\""
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.193700 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m47gh\" (UniqueName: \"kubernetes.io/projected/04cb8528-d9be-44be-9cf2-70fc0e239a33-kube-api-access-m47gh\") on node \"crc\" DevicePath \"\""
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.193714 4845 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.193725 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.198621 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "04cb8528-d9be-44be-9cf2-70fc0e239a33" (UID: "04cb8528-d9be-44be-9cf2-70fc0e239a33"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.295265 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04cb8528-d9be-44be-9cf2-70fc0e239a33-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.513228 4845 generic.go:334] "Generic (PLEG): container finished" podID="04cb8528-d9be-44be-9cf2-70fc0e239a33" containerID="4c8515bc29649291a8afb21ba76d898dcf18499227f9fe49bb8954b68469a34b" exitCode=0
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.513283 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" event={"ID":"04cb8528-d9be-44be-9cf2-70fc0e239a33","Type":"ContainerDied","Data":"4c8515bc29649291a8afb21ba76d898dcf18499227f9fe49bb8954b68469a34b"}
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.513322 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5" event={"ID":"04cb8528-d9be-44be-9cf2-70fc0e239a33","Type":"ContainerDied","Data":"8615e2e961fab1afb60c4a84431ee76ed62c4cd6c8ffdb3ad05ba6626fa46044"}
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.513329 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74d6f9b95f-hmrq5"
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.513341 4845 scope.go:117] "RemoveContainer" containerID="4c8515bc29649291a8afb21ba76d898dcf18499227f9fe49bb8954b68469a34b"
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.566387 4845 scope.go:117] "RemoveContainer" containerID="cef220cfaac095514d0c11524a376c14818ee9d149cca9668fa904acc747a820"
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.573829 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74d6f9b95f-hmrq5"]
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.594901 4845 scope.go:117] "RemoveContainer" containerID="4c8515bc29649291a8afb21ba76d898dcf18499227f9fe49bb8954b68469a34b"
Oct 06 07:05:07 crc kubenswrapper[4845]: E1006 07:05:07.595321 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c8515bc29649291a8afb21ba76d898dcf18499227f9fe49bb8954b68469a34b\": container with ID starting with 4c8515bc29649291a8afb21ba76d898dcf18499227f9fe49bb8954b68469a34b not found: ID does not exist" containerID="4c8515bc29649291a8afb21ba76d898dcf18499227f9fe49bb8954b68469a34b"
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.595432 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c8515bc29649291a8afb21ba76d898dcf18499227f9fe49bb8954b68469a34b"} err="failed to get container status \"4c8515bc29649291a8afb21ba76d898dcf18499227f9fe49bb8954b68469a34b\": rpc error: code = NotFound desc = could not find container \"4c8515bc29649291a8afb21ba76d898dcf18499227f9fe49bb8954b68469a34b\": container with ID starting with 4c8515bc29649291a8afb21ba76d898dcf18499227f9fe49bb8954b68469a34b not found: ID does not exist"
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.595484 4845 scope.go:117] "RemoveContainer" containerID="cef220cfaac095514d0c11524a376c14818ee9d149cca9668fa904acc747a820"
Oct 06 07:05:07 crc kubenswrapper[4845]: E1006 07:05:07.595849 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cef220cfaac095514d0c11524a376c14818ee9d149cca9668fa904acc747a820\": container with ID starting with cef220cfaac095514d0c11524a376c14818ee9d149cca9668fa904acc747a820 not found: ID does not exist" containerID="cef220cfaac095514d0c11524a376c14818ee9d149cca9668fa904acc747a820"
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.595887 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cef220cfaac095514d0c11524a376c14818ee9d149cca9668fa904acc747a820"} err="failed to get container status \"cef220cfaac095514d0c11524a376c14818ee9d149cca9668fa904acc747a820\": rpc error: code = NotFound desc = could not find container \"cef220cfaac095514d0c11524a376c14818ee9d149cca9668fa904acc747a820\": container with ID starting with cef220cfaac095514d0c11524a376c14818ee9d149cca9668fa904acc747a820 not found: ID does not exist"
Oct 06 07:05:07 crc kubenswrapper[4845]: I1006 07:05:07.598333 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74d6f9b95f-hmrq5"]
Oct 06 07:05:08 crc kubenswrapper[4845]: I1006 07:05:08.261475 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04cb8528-d9be-44be-9cf2-70fc0e239a33" path="/var/lib/kubelet/pods/04cb8528-d9be-44be-9cf2-70fc0e239a33/volumes"
Oct 06 07:05:18 crc kubenswrapper[4845]: I1006 07:05:18.612407 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"28832f5d-962f-4eef-8903-ab5061b80102","Type":"ContainerDied","Data":"78abf6f4c708a576292b2a592d45105e2650f5288d7e58e13f9a14b0eea5a9e8"}
Oct 06 07:05:18 crc kubenswrapper[4845]: I1006 07:05:18.612413 4845 generic.go:334] "Generic (PLEG): container finished" podID="28832f5d-962f-4eef-8903-ab5061b80102" containerID="78abf6f4c708a576292b2a592d45105e2650f5288d7e58e13f9a14b0eea5a9e8" exitCode=0
Oct 06 07:05:18 crc kubenswrapper[4845]: I1006 07:05:18.616010 4845 generic.go:334] "Generic (PLEG): container finished" podID="c295b190-30fe-47c3-ae27-c6b809bbe058" containerID="f86378516c9c77c760d0afb916d51fd0355fa25f429b0b5b09229cb16a091b9e" exitCode=0
Oct 06 07:05:18 crc kubenswrapper[4845]: I1006 07:05:18.616049 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c295b190-30fe-47c3-ae27-c6b809bbe058","Type":"ContainerDied","Data":"f86378516c9c77c760d0afb916d51fd0355fa25f429b0b5b09229cb16a091b9e"}
Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.413547 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg"]
Oct 06 07:05:19 crc kubenswrapper[4845]: E1006 07:05:19.414344 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c731f5-0d76-4221-9d87-67d746bbc8e6" containerName="init"
Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.414363 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c731f5-0d76-4221-9d87-67d746bbc8e6" containerName="init"
Oct 06 07:05:19 crc kubenswrapper[4845]: E1006 07:05:19.414411 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04cb8528-d9be-44be-9cf2-70fc0e239a33" containerName="dnsmasq-dns"
Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.414422 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="04cb8528-d9be-44be-9cf2-70fc0e239a33" containerName="dnsmasq-dns"
Oct 06 07:05:19 crc kubenswrapper[4845]: E1006 07:05:19.414458 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04cb8528-d9be-44be-9cf2-70fc0e239a33" containerName="init"
Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.414467 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="04cb8528-d9be-44be-9cf2-70fc0e239a33"
containerName="init" Oct 06 07:05:19 crc kubenswrapper[4845]: E1006 07:05:19.414479 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c731f5-0d76-4221-9d87-67d746bbc8e6" containerName="dnsmasq-dns" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.414487 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c731f5-0d76-4221-9d87-67d746bbc8e6" containerName="dnsmasq-dns" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.414703 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="04cb8528-d9be-44be-9cf2-70fc0e239a33" containerName="dnsmasq-dns" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.414737 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1c731f5-0d76-4221-9d87-67d746bbc8e6" containerName="dnsmasq-dns" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.415582 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.418018 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.418295 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.419442 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.419674 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p48vv" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.428742 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg"] Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.523805 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4358b521-4e22-42e2-9844-79612bf845b8-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg\" (UID: \"4358b521-4e22-42e2-9844-79612bf845b8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.524180 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btq4t\" (UniqueName: \"kubernetes.io/projected/4358b521-4e22-42e2-9844-79612bf845b8-kube-api-access-btq4t\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg\" (UID: \"4358b521-4e22-42e2-9844-79612bf845b8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.524226 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4358b521-4e22-42e2-9844-79612bf845b8-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg\" (UID: \"4358b521-4e22-42e2-9844-79612bf845b8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.524265 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4358b521-4e22-42e2-9844-79612bf845b8-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg\" (UID: \"4358b521-4e22-42e2-9844-79612bf845b8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.625447 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/4358b521-4e22-42e2-9844-79612bf845b8-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg\" (UID: \"4358b521-4e22-42e2-9844-79612bf845b8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.625556 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4358b521-4e22-42e2-9844-79612bf845b8-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg\" (UID: \"4358b521-4e22-42e2-9844-79612bf845b8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.625619 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btq4t\" (UniqueName: \"kubernetes.io/projected/4358b521-4e22-42e2-9844-79612bf845b8-kube-api-access-btq4t\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg\" (UID: \"4358b521-4e22-42e2-9844-79612bf845b8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.625662 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4358b521-4e22-42e2-9844-79612bf845b8-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg\" (UID: \"4358b521-4e22-42e2-9844-79612bf845b8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.627842 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"28832f5d-962f-4eef-8903-ab5061b80102","Type":"ContainerStarted","Data":"ec0ec64a4210323f5ec8a14059cf6eea9730d4bf207874f5e49f90862c0f04c4"} Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.628059 4845 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.633878 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4358b521-4e22-42e2-9844-79612bf845b8-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg\" (UID: \"4358b521-4e22-42e2-9844-79612bf845b8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.634089 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4358b521-4e22-42e2-9844-79612bf845b8-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg\" (UID: \"4358b521-4e22-42e2-9844-79612bf845b8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.634134 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4358b521-4e22-42e2-9844-79612bf845b8-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg\" (UID: \"4358b521-4e22-42e2-9844-79612bf845b8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.634677 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c295b190-30fe-47c3-ae27-c6b809bbe058","Type":"ContainerStarted","Data":"92c7b4fbf98feeea6348789c58cda98dd3255bef88d86fce671a7a22aea83eef"} Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.634902 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.643091 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btq4t\" 
(UniqueName: \"kubernetes.io/projected/4358b521-4e22-42e2-9844-79612bf845b8-kube-api-access-btq4t\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg\" (UID: \"4358b521-4e22-42e2-9844-79612bf845b8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.659394 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.659361271 podStartE2EDuration="36.659361271s" podCreationTimestamp="2025-10-06 07:04:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:05:19.653094191 +0000 UTC m=+1204.167835209" watchObservedRunningTime="2025-10-06 07:05:19.659361271 +0000 UTC m=+1204.174102269" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.676850 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.676832737 podStartE2EDuration="36.676832737s" podCreationTimestamp="2025-10-06 07:04:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:05:19.673841901 +0000 UTC m=+1204.188582939" watchObservedRunningTime="2025-10-06 07:05:19.676832737 +0000 UTC m=+1204.191573745" Oct 06 07:05:19 crc kubenswrapper[4845]: I1006 07:05:19.741001 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" Oct 06 07:05:20 crc kubenswrapper[4845]: I1006 07:05:20.327346 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg"] Oct 06 07:05:20 crc kubenswrapper[4845]: I1006 07:05:20.332837 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 07:05:20 crc kubenswrapper[4845]: I1006 07:05:20.643186 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" event={"ID":"4358b521-4e22-42e2-9844-79612bf845b8","Type":"ContainerStarted","Data":"5ffb62e80ed2c7b0c0424427119358b62ca996b7841fdc00bf690e9233f0babf"} Oct 06 07:05:30 crc kubenswrapper[4845]: I1006 07:05:30.748803 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" event={"ID":"4358b521-4e22-42e2-9844-79612bf845b8","Type":"ContainerStarted","Data":"4ea76ca530a5a65bb06acdd8cd4471bad9f7727910af40137a347b5507370a18"} Oct 06 07:05:30 crc kubenswrapper[4845]: I1006 07:05:30.784151 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" podStartSLOduration=2.21423691 podStartE2EDuration="11.784128864s" podCreationTimestamp="2025-10-06 07:05:19 +0000 UTC" firstStartedPulling="2025-10-06 07:05:20.332580531 +0000 UTC m=+1204.847321539" lastFinishedPulling="2025-10-06 07:05:29.902472495 +0000 UTC m=+1214.417213493" observedRunningTime="2025-10-06 07:05:30.763802395 +0000 UTC m=+1215.278543413" watchObservedRunningTime="2025-10-06 07:05:30.784128864 +0000 UTC m=+1215.298869882" Oct 06 07:05:33 crc kubenswrapper[4845]: I1006 07:05:33.672602 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 06 07:05:33 crc kubenswrapper[4845]: I1006 
07:05:33.698562 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 06 07:05:40 crc kubenswrapper[4845]: I1006 07:05:40.885267 4845 generic.go:334] "Generic (PLEG): container finished" podID="4358b521-4e22-42e2-9844-79612bf845b8" containerID="4ea76ca530a5a65bb06acdd8cd4471bad9f7727910af40137a347b5507370a18" exitCode=0 Oct 06 07:05:40 crc kubenswrapper[4845]: I1006 07:05:40.885363 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" event={"ID":"4358b521-4e22-42e2-9844-79612bf845b8","Type":"ContainerDied","Data":"4ea76ca530a5a65bb06acdd8cd4471bad9f7727910af40137a347b5507370a18"} Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.383553 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.414323 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4358b521-4e22-42e2-9844-79612bf845b8-repo-setup-combined-ca-bundle\") pod \"4358b521-4e22-42e2-9844-79612bf845b8\" (UID: \"4358b521-4e22-42e2-9844-79612bf845b8\") " Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.414882 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4358b521-4e22-42e2-9844-79612bf845b8-ssh-key\") pod \"4358b521-4e22-42e2-9844-79612bf845b8\" (UID: \"4358b521-4e22-42e2-9844-79612bf845b8\") " Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.415080 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btq4t\" (UniqueName: \"kubernetes.io/projected/4358b521-4e22-42e2-9844-79612bf845b8-kube-api-access-btq4t\") pod \"4358b521-4e22-42e2-9844-79612bf845b8\" (UID: 
\"4358b521-4e22-42e2-9844-79612bf845b8\") " Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.415156 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4358b521-4e22-42e2-9844-79612bf845b8-inventory\") pod \"4358b521-4e22-42e2-9844-79612bf845b8\" (UID: \"4358b521-4e22-42e2-9844-79612bf845b8\") " Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.427869 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4358b521-4e22-42e2-9844-79612bf845b8-kube-api-access-btq4t" (OuterVolumeSpecName: "kube-api-access-btq4t") pod "4358b521-4e22-42e2-9844-79612bf845b8" (UID: "4358b521-4e22-42e2-9844-79612bf845b8"). InnerVolumeSpecName "kube-api-access-btq4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.428327 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4358b521-4e22-42e2-9844-79612bf845b8-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "4358b521-4e22-42e2-9844-79612bf845b8" (UID: "4358b521-4e22-42e2-9844-79612bf845b8"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.469072 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4358b521-4e22-42e2-9844-79612bf845b8-inventory" (OuterVolumeSpecName: "inventory") pod "4358b521-4e22-42e2-9844-79612bf845b8" (UID: "4358b521-4e22-42e2-9844-79612bf845b8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.488193 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4358b521-4e22-42e2-9844-79612bf845b8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4358b521-4e22-42e2-9844-79612bf845b8" (UID: "4358b521-4e22-42e2-9844-79612bf845b8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.517508 4845 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4358b521-4e22-42e2-9844-79612bf845b8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.517541 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btq4t\" (UniqueName: \"kubernetes.io/projected/4358b521-4e22-42e2-9844-79612bf845b8-kube-api-access-btq4t\") on node \"crc\" DevicePath \"\"" Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.517552 4845 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4358b521-4e22-42e2-9844-79612bf845b8-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.517562 4845 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4358b521-4e22-42e2-9844-79612bf845b8-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.910949 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" event={"ID":"4358b521-4e22-42e2-9844-79612bf845b8","Type":"ContainerDied","Data":"5ffb62e80ed2c7b0c0424427119358b62ca996b7841fdc00bf690e9233f0babf"} Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.910997 4845 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="5ffb62e80ed2c7b0c0424427119358b62ca996b7841fdc00bf690e9233f0babf" Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.911417 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg" Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.972437 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd"] Oct 06 07:05:42 crc kubenswrapper[4845]: E1006 07:05:42.972840 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4358b521-4e22-42e2-9844-79612bf845b8" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.972857 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="4358b521-4e22-42e2-9844-79612bf845b8" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.973019 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="4358b521-4e22-42e2-9844-79612bf845b8" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.973781 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd" Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.978204 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.978600 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.979599 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p48vv" Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.979902 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 07:05:42 crc kubenswrapper[4845]: I1006 07:05:42.988296 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd"] Oct 06 07:05:43 crc kubenswrapper[4845]: I1006 07:05:43.036897 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9j26\" (UniqueName: \"kubernetes.io/projected/88581a77-2703-438c-a5d0-e6972d815990-kube-api-access-p9j26\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dvkwd\" (UID: \"88581a77-2703-438c-a5d0-e6972d815990\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd" Oct 06 07:05:43 crc kubenswrapper[4845]: I1006 07:05:43.036969 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88581a77-2703-438c-a5d0-e6972d815990-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dvkwd\" (UID: \"88581a77-2703-438c-a5d0-e6972d815990\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd" Oct 06 07:05:43 crc kubenswrapper[4845]: I1006 07:05:43.037127 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88581a77-2703-438c-a5d0-e6972d815990-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dvkwd\" (UID: \"88581a77-2703-438c-a5d0-e6972d815990\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd" Oct 06 07:05:43 crc kubenswrapper[4845]: I1006 07:05:43.139198 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9j26\" (UniqueName: \"kubernetes.io/projected/88581a77-2703-438c-a5d0-e6972d815990-kube-api-access-p9j26\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dvkwd\" (UID: \"88581a77-2703-438c-a5d0-e6972d815990\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd" Oct 06 07:05:43 crc kubenswrapper[4845]: I1006 07:05:43.139574 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88581a77-2703-438c-a5d0-e6972d815990-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dvkwd\" (UID: \"88581a77-2703-438c-a5d0-e6972d815990\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd" Oct 06 07:05:43 crc kubenswrapper[4845]: I1006 07:05:43.139630 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88581a77-2703-438c-a5d0-e6972d815990-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dvkwd\" (UID: \"88581a77-2703-438c-a5d0-e6972d815990\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd" Oct 06 07:05:43 crc kubenswrapper[4845]: I1006 07:05:43.145457 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88581a77-2703-438c-a5d0-e6972d815990-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dvkwd\" (UID: \"88581a77-2703-438c-a5d0-e6972d815990\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd" Oct 06 07:05:43 crc kubenswrapper[4845]: I1006 07:05:43.145806 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88581a77-2703-438c-a5d0-e6972d815990-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dvkwd\" (UID: \"88581a77-2703-438c-a5d0-e6972d815990\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd" Oct 06 07:05:43 crc kubenswrapper[4845]: I1006 07:05:43.156730 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9j26\" (UniqueName: \"kubernetes.io/projected/88581a77-2703-438c-a5d0-e6972d815990-kube-api-access-p9j26\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dvkwd\" (UID: \"88581a77-2703-438c-a5d0-e6972d815990\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd" Oct 06 07:05:43 crc kubenswrapper[4845]: I1006 07:05:43.291359 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd" Oct 06 07:05:43 crc kubenswrapper[4845]: I1006 07:05:43.792788 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd"] Oct 06 07:05:43 crc kubenswrapper[4845]: I1006 07:05:43.921423 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd" event={"ID":"88581a77-2703-438c-a5d0-e6972d815990","Type":"ContainerStarted","Data":"1edf1f2e4f770a55eb99e9376241c7fc8273cf6611ea5155598d2d65b77aac49"} Oct 06 07:05:44 crc kubenswrapper[4845]: I1006 07:05:44.935358 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd" event={"ID":"88581a77-2703-438c-a5d0-e6972d815990","Type":"ContainerStarted","Data":"7c5b09a979102e32e35c4f1512bba6371721fcce40c72a332391d2cd91039cce"} Oct 06 07:05:44 crc kubenswrapper[4845]: I1006 07:05:44.957298 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd" podStartSLOduration=2.434888745 podStartE2EDuration="2.957260804s" podCreationTimestamp="2025-10-06 07:05:42 +0000 UTC" firstStartedPulling="2025-10-06 07:05:43.80027422 +0000 UTC m=+1228.315015228" lastFinishedPulling="2025-10-06 07:05:44.322646279 +0000 UTC m=+1228.837387287" observedRunningTime="2025-10-06 07:05:44.952469912 +0000 UTC m=+1229.467210960" watchObservedRunningTime="2025-10-06 07:05:44.957260804 +0000 UTC m=+1229.472001822" Oct 06 07:05:47 crc kubenswrapper[4845]: I1006 07:05:47.973188 4845 generic.go:334] "Generic (PLEG): container finished" podID="88581a77-2703-438c-a5d0-e6972d815990" containerID="7c5b09a979102e32e35c4f1512bba6371721fcce40c72a332391d2cd91039cce" exitCode=0 Oct 06 07:05:47 crc kubenswrapper[4845]: I1006 07:05:47.973275 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd" event={"ID":"88581a77-2703-438c-a5d0-e6972d815990","Type":"ContainerDied","Data":"7c5b09a979102e32e35c4f1512bba6371721fcce40c72a332391d2cd91039cce"} Oct 06 07:05:49 crc kubenswrapper[4845]: I1006 07:05:49.483874 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd" Oct 06 07:05:49 crc kubenswrapper[4845]: I1006 07:05:49.683297 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88581a77-2703-438c-a5d0-e6972d815990-ssh-key\") pod \"88581a77-2703-438c-a5d0-e6972d815990\" (UID: \"88581a77-2703-438c-a5d0-e6972d815990\") " Oct 06 07:05:49 crc kubenswrapper[4845]: I1006 07:05:49.683587 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88581a77-2703-438c-a5d0-e6972d815990-inventory\") pod \"88581a77-2703-438c-a5d0-e6972d815990\" (UID: \"88581a77-2703-438c-a5d0-e6972d815990\") " Oct 06 07:05:49 crc kubenswrapper[4845]: I1006 07:05:49.683631 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9j26\" (UniqueName: \"kubernetes.io/projected/88581a77-2703-438c-a5d0-e6972d815990-kube-api-access-p9j26\") pod \"88581a77-2703-438c-a5d0-e6972d815990\" (UID: \"88581a77-2703-438c-a5d0-e6972d815990\") " Oct 06 07:05:49 crc kubenswrapper[4845]: I1006 07:05:49.690930 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88581a77-2703-438c-a5d0-e6972d815990-kube-api-access-p9j26" (OuterVolumeSpecName: "kube-api-access-p9j26") pod "88581a77-2703-438c-a5d0-e6972d815990" (UID: "88581a77-2703-438c-a5d0-e6972d815990"). InnerVolumeSpecName "kube-api-access-p9j26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:05:49 crc kubenswrapper[4845]: E1006 07:05:49.709710 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88581a77-2703-438c-a5d0-e6972d815990-inventory podName:88581a77-2703-438c-a5d0-e6972d815990 nodeName:}" failed. No retries permitted until 2025-10-06 07:05:50.209677807 +0000 UTC m=+1234.724418815 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/88581a77-2703-438c-a5d0-e6972d815990-inventory") pod "88581a77-2703-438c-a5d0-e6972d815990" (UID: "88581a77-2703-438c-a5d0-e6972d815990") : error deleting /var/lib/kubelet/pods/88581a77-2703-438c-a5d0-e6972d815990/volume-subpaths: remove /var/lib/kubelet/pods/88581a77-2703-438c-a5d0-e6972d815990/volume-subpaths: no such file or directory Oct 06 07:05:49 crc kubenswrapper[4845]: I1006 07:05:49.712607 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88581a77-2703-438c-a5d0-e6972d815990-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "88581a77-2703-438c-a5d0-e6972d815990" (UID: "88581a77-2703-438c-a5d0-e6972d815990"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:05:49 crc kubenswrapper[4845]: I1006 07:05:49.785686 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9j26\" (UniqueName: \"kubernetes.io/projected/88581a77-2703-438c-a5d0-e6972d815990-kube-api-access-p9j26\") on node \"crc\" DevicePath \"\"" Oct 06 07:05:49 crc kubenswrapper[4845]: I1006 07:05:49.785745 4845 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88581a77-2703-438c-a5d0-e6972d815990-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 07:05:49 crc kubenswrapper[4845]: I1006 07:05:49.994782 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd" event={"ID":"88581a77-2703-438c-a5d0-e6972d815990","Type":"ContainerDied","Data":"1edf1f2e4f770a55eb99e9376241c7fc8273cf6611ea5155598d2d65b77aac49"} Oct 06 07:05:49 crc kubenswrapper[4845]: I1006 07:05:49.994825 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1edf1f2e4f770a55eb99e9376241c7fc8273cf6611ea5155598d2d65b77aac49" Oct 06 07:05:49 crc kubenswrapper[4845]: I1006 07:05:49.994877 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dvkwd" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.055949 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4"] Oct 06 07:05:50 crc kubenswrapper[4845]: E1006 07:05:50.056316 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88581a77-2703-438c-a5d0-e6972d815990" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.056333 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="88581a77-2703-438c-a5d0-e6972d815990" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.056552 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="88581a77-2703-438c-a5d0-e6972d815990" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.057223 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.071466 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4"] Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.093369 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca7a9ec-05b1-46ae-bc84-065bf4904784-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4\" (UID: \"5ca7a9ec-05b1-46ae-bc84-065bf4904784\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.093439 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngn9j\" (UniqueName: \"kubernetes.io/projected/5ca7a9ec-05b1-46ae-bc84-065bf4904784-kube-api-access-ngn9j\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4\" (UID: \"5ca7a9ec-05b1-46ae-bc84-065bf4904784\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.093499 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ca7a9ec-05b1-46ae-bc84-065bf4904784-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4\" (UID: \"5ca7a9ec-05b1-46ae-bc84-065bf4904784\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.093828 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ca7a9ec-05b1-46ae-bc84-065bf4904784-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4\" (UID: 
\"5ca7a9ec-05b1-46ae-bc84-065bf4904784\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.195727 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ca7a9ec-05b1-46ae-bc84-065bf4904784-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4\" (UID: \"5ca7a9ec-05b1-46ae-bc84-065bf4904784\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.195870 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca7a9ec-05b1-46ae-bc84-065bf4904784-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4\" (UID: \"5ca7a9ec-05b1-46ae-bc84-065bf4904784\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.195914 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngn9j\" (UniqueName: \"kubernetes.io/projected/5ca7a9ec-05b1-46ae-bc84-065bf4904784-kube-api-access-ngn9j\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4\" (UID: \"5ca7a9ec-05b1-46ae-bc84-065bf4904784\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.196000 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ca7a9ec-05b1-46ae-bc84-065bf4904784-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4\" (UID: \"5ca7a9ec-05b1-46ae-bc84-065bf4904784\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.200688 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/5ca7a9ec-05b1-46ae-bc84-065bf4904784-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4\" (UID: \"5ca7a9ec-05b1-46ae-bc84-065bf4904784\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.202801 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca7a9ec-05b1-46ae-bc84-065bf4904784-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4\" (UID: \"5ca7a9ec-05b1-46ae-bc84-065bf4904784\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.208284 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ca7a9ec-05b1-46ae-bc84-065bf4904784-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4\" (UID: \"5ca7a9ec-05b1-46ae-bc84-065bf4904784\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.219134 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngn9j\" (UniqueName: \"kubernetes.io/projected/5ca7a9ec-05b1-46ae-bc84-065bf4904784-kube-api-access-ngn9j\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4\" (UID: \"5ca7a9ec-05b1-46ae-bc84-065bf4904784\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.297043 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88581a77-2703-438c-a5d0-e6972d815990-inventory\") pod \"88581a77-2703-438c-a5d0-e6972d815990\" (UID: \"88581a77-2703-438c-a5d0-e6972d815990\") " Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.300247 4845 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88581a77-2703-438c-a5d0-e6972d815990-inventory" (OuterVolumeSpecName: "inventory") pod "88581a77-2703-438c-a5d0-e6972d815990" (UID: "88581a77-2703-438c-a5d0-e6972d815990"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.398659 4845 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88581a77-2703-438c-a5d0-e6972d815990-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.425512 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" Oct 06 07:05:50 crc kubenswrapper[4845]: I1006 07:05:50.950611 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4"] Oct 06 07:05:51 crc kubenswrapper[4845]: I1006 07:05:51.003838 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" event={"ID":"5ca7a9ec-05b1-46ae-bc84-065bf4904784","Type":"ContainerStarted","Data":"783a9f483b7f54148f8357e83128c363f32c0eacde067b9649de4d5a7552dd15"} Oct 06 07:05:52 crc kubenswrapper[4845]: I1006 07:05:52.013567 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" event={"ID":"5ca7a9ec-05b1-46ae-bc84-065bf4904784","Type":"ContainerStarted","Data":"1168fd1788d092a019d7435d54a77b4c40613df7a5dc46c7c80e685ed3ccc7fe"} Oct 06 07:05:52 crc kubenswrapper[4845]: I1006 07:05:52.031652 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" podStartSLOduration=1.5709921040000001 podStartE2EDuration="2.031629928s" podCreationTimestamp="2025-10-06 07:05:50 
+0000 UTC" firstStartedPulling="2025-10-06 07:05:50.955066756 +0000 UTC m=+1235.469807764" lastFinishedPulling="2025-10-06 07:05:51.41570456 +0000 UTC m=+1235.930445588" observedRunningTime="2025-10-06 07:05:52.028498708 +0000 UTC m=+1236.543239726" watchObservedRunningTime="2025-10-06 07:05:52.031629928 +0000 UTC m=+1236.546370956" Oct 06 07:06:22 crc kubenswrapper[4845]: I1006 07:06:22.085139 4845 scope.go:117] "RemoveContainer" containerID="f6ad3dc98a1120a269ca30b7fe9907210db12e9f1d9664f17aeaddd9e82e75ef" Oct 06 07:06:22 crc kubenswrapper[4845]: I1006 07:06:22.116851 4845 scope.go:117] "RemoveContainer" containerID="abf8df35049529fb330e60e49c4d5127663aa42ce0c131f643d3094bcc0cbeac" Oct 06 07:06:22 crc kubenswrapper[4845]: I1006 07:06:22.136696 4845 scope.go:117] "RemoveContainer" containerID="96190030a88ff80858e707644abc72592e8c49a60ec3b2a2aedfd6d04a82a1df" Oct 06 07:06:23 crc kubenswrapper[4845]: I1006 07:06:23.019629 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 07:06:23 crc kubenswrapper[4845]: I1006 07:06:23.019908 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 07:06:53 crc kubenswrapper[4845]: I1006 07:06:53.019443 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 07:06:53 crc 
kubenswrapper[4845]: I1006 07:06:53.020003 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 07:07:22 crc kubenswrapper[4845]: I1006 07:07:22.232427 4845 scope.go:117] "RemoveContainer" containerID="0ee17b80a502005f4a41fe516463e1629e18c66e11e5aa7120fc2e30bdcbc00e" Oct 06 07:07:22 crc kubenswrapper[4845]: I1006 07:07:22.261205 4845 scope.go:117] "RemoveContainer" containerID="3ccae0df7237bcb508d8c80ca8a9308e0b0d830a146e3163605d943100fb0e5f" Oct 06 07:07:22 crc kubenswrapper[4845]: I1006 07:07:22.295450 4845 scope.go:117] "RemoveContainer" containerID="6ca31b3bb3eca616eed65099486e1d33a2363ee9d6718c2929707499cb202e23" Oct 06 07:07:22 crc kubenswrapper[4845]: I1006 07:07:22.354621 4845 scope.go:117] "RemoveContainer" containerID="188c1f227b38688df28380b75a2bfaa3f052af97efb3b86e7ccd34f27a80e993" Oct 06 07:07:23 crc kubenswrapper[4845]: I1006 07:07:23.019230 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 07:07:23 crc kubenswrapper[4845]: I1006 07:07:23.019300 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 07:07:23 crc kubenswrapper[4845]: I1006 07:07:23.019353 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" Oct 06 07:07:23 crc kubenswrapper[4845]: I1006 07:07:23.020210 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"747bfea9f095b79ec9738d4756f5ae602d44718e89fd27297d3071705491e54c"} pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 07:07:23 crc kubenswrapper[4845]: I1006 07:07:23.020284 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" containerID="cri-o://747bfea9f095b79ec9738d4756f5ae602d44718e89fd27297d3071705491e54c" gracePeriod=600 Oct 06 07:07:23 crc kubenswrapper[4845]: I1006 07:07:23.892146 4845 generic.go:334] "Generic (PLEG): container finished" podID="6936952c-09f0-48fd-8832-38c18202ae81" containerID="747bfea9f095b79ec9738d4756f5ae602d44718e89fd27297d3071705491e54c" exitCode=0 Oct 06 07:07:23 crc kubenswrapper[4845]: I1006 07:07:23.892184 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerDied","Data":"747bfea9f095b79ec9738d4756f5ae602d44718e89fd27297d3071705491e54c"} Oct 06 07:07:23 crc kubenswrapper[4845]: I1006 07:07:23.892721 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerStarted","Data":"49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9"} Oct 06 07:07:23 crc kubenswrapper[4845]: I1006 07:07:23.892753 4845 scope.go:117] "RemoveContainer" 
containerID="6a4030ef2b48fb5db1ac392bd2dae2cd42e97737e449c0b7f5beb300ab99f64c" Oct 06 07:08:35 crc kubenswrapper[4845]: I1006 07:08:35.146308 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pv7kf"] Oct 06 07:08:35 crc kubenswrapper[4845]: I1006 07:08:35.182688 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pv7kf" Oct 06 07:08:35 crc kubenswrapper[4845]: I1006 07:08:35.199780 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pv7kf"] Oct 06 07:08:35 crc kubenswrapper[4845]: I1006 07:08:35.387358 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnsrq\" (UniqueName: \"kubernetes.io/projected/b972b9f4-50ac-48f9-a750-16eff3d468d2-kube-api-access-jnsrq\") pod \"certified-operators-pv7kf\" (UID: \"b972b9f4-50ac-48f9-a750-16eff3d468d2\") " pod="openshift-marketplace/certified-operators-pv7kf" Oct 06 07:08:35 crc kubenswrapper[4845]: I1006 07:08:35.387540 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b972b9f4-50ac-48f9-a750-16eff3d468d2-catalog-content\") pod \"certified-operators-pv7kf\" (UID: \"b972b9f4-50ac-48f9-a750-16eff3d468d2\") " pod="openshift-marketplace/certified-operators-pv7kf" Oct 06 07:08:35 crc kubenswrapper[4845]: I1006 07:08:35.387576 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b972b9f4-50ac-48f9-a750-16eff3d468d2-utilities\") pod \"certified-operators-pv7kf\" (UID: \"b972b9f4-50ac-48f9-a750-16eff3d468d2\") " pod="openshift-marketplace/certified-operators-pv7kf" Oct 06 07:08:35 crc kubenswrapper[4845]: I1006 07:08:35.488883 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b972b9f4-50ac-48f9-a750-16eff3d468d2-catalog-content\") pod \"certified-operators-pv7kf\" (UID: \"b972b9f4-50ac-48f9-a750-16eff3d468d2\") " pod="openshift-marketplace/certified-operators-pv7kf" Oct 06 07:08:35 crc kubenswrapper[4845]: I1006 07:08:35.488941 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b972b9f4-50ac-48f9-a750-16eff3d468d2-utilities\") pod \"certified-operators-pv7kf\" (UID: \"b972b9f4-50ac-48f9-a750-16eff3d468d2\") " pod="openshift-marketplace/certified-operators-pv7kf" Oct 06 07:08:35 crc kubenswrapper[4845]: I1006 07:08:35.489084 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnsrq\" (UniqueName: \"kubernetes.io/projected/b972b9f4-50ac-48f9-a750-16eff3d468d2-kube-api-access-jnsrq\") pod \"certified-operators-pv7kf\" (UID: \"b972b9f4-50ac-48f9-a750-16eff3d468d2\") " pod="openshift-marketplace/certified-operators-pv7kf" Oct 06 07:08:35 crc kubenswrapper[4845]: I1006 07:08:35.489454 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b972b9f4-50ac-48f9-a750-16eff3d468d2-catalog-content\") pod \"certified-operators-pv7kf\" (UID: \"b972b9f4-50ac-48f9-a750-16eff3d468d2\") " pod="openshift-marketplace/certified-operators-pv7kf" Oct 06 07:08:35 crc kubenswrapper[4845]: I1006 07:08:35.489464 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b972b9f4-50ac-48f9-a750-16eff3d468d2-utilities\") pod \"certified-operators-pv7kf\" (UID: \"b972b9f4-50ac-48f9-a750-16eff3d468d2\") " pod="openshift-marketplace/certified-operators-pv7kf" Oct 06 07:08:35 crc kubenswrapper[4845]: I1006 07:08:35.511744 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnsrq\" (UniqueName: 
\"kubernetes.io/projected/b972b9f4-50ac-48f9-a750-16eff3d468d2-kube-api-access-jnsrq\") pod \"certified-operators-pv7kf\" (UID: \"b972b9f4-50ac-48f9-a750-16eff3d468d2\") " pod="openshift-marketplace/certified-operators-pv7kf" Oct 06 07:08:35 crc kubenswrapper[4845]: I1006 07:08:35.515492 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pv7kf" Oct 06 07:08:35 crc kubenswrapper[4845]: I1006 07:08:35.987761 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pv7kf"] Oct 06 07:08:36 crc kubenswrapper[4845]: I1006 07:08:36.577692 4845 generic.go:334] "Generic (PLEG): container finished" podID="b972b9f4-50ac-48f9-a750-16eff3d468d2" containerID="c1d421d9d7e8efd49fcd689550a3c49b2cdaa39b62786dd84f21a9370013e3c2" exitCode=0 Oct 06 07:08:36 crc kubenswrapper[4845]: I1006 07:08:36.577741 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pv7kf" event={"ID":"b972b9f4-50ac-48f9-a750-16eff3d468d2","Type":"ContainerDied","Data":"c1d421d9d7e8efd49fcd689550a3c49b2cdaa39b62786dd84f21a9370013e3c2"} Oct 06 07:08:36 crc kubenswrapper[4845]: I1006 07:08:36.577771 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pv7kf" event={"ID":"b972b9f4-50ac-48f9-a750-16eff3d468d2","Type":"ContainerStarted","Data":"c501c5020f0492a35407c5ea8cccd0c1d381a8402193964717b8697fee8ac770"} Oct 06 07:08:37 crc kubenswrapper[4845]: I1006 07:08:37.588917 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pv7kf" event={"ID":"b972b9f4-50ac-48f9-a750-16eff3d468d2","Type":"ContainerStarted","Data":"426c12116b98f66aee474fdd4ba2a147ae405dd05165b67b0568b3b6b88e0286"} Oct 06 07:08:38 crc kubenswrapper[4845]: I1006 07:08:38.602756 4845 generic.go:334] "Generic (PLEG): container finished" podID="b972b9f4-50ac-48f9-a750-16eff3d468d2" 
containerID="426c12116b98f66aee474fdd4ba2a147ae405dd05165b67b0568b3b6b88e0286" exitCode=0 Oct 06 07:08:38 crc kubenswrapper[4845]: I1006 07:08:38.602795 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pv7kf" event={"ID":"b972b9f4-50ac-48f9-a750-16eff3d468d2","Type":"ContainerDied","Data":"426c12116b98f66aee474fdd4ba2a147ae405dd05165b67b0568b3b6b88e0286"} Oct 06 07:08:39 crc kubenswrapper[4845]: I1006 07:08:39.612287 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pv7kf" event={"ID":"b972b9f4-50ac-48f9-a750-16eff3d468d2","Type":"ContainerStarted","Data":"794be9c3addb5b3b5df281dedcec8460e1933c3927ad7dac70be2211bac29ec9"} Oct 06 07:08:39 crc kubenswrapper[4845]: I1006 07:08:39.640288 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pv7kf" podStartSLOduration=2.215195453 podStartE2EDuration="4.640271214s" podCreationTimestamp="2025-10-06 07:08:35 +0000 UTC" firstStartedPulling="2025-10-06 07:08:36.579221003 +0000 UTC m=+1401.093962011" lastFinishedPulling="2025-10-06 07:08:39.004296764 +0000 UTC m=+1403.519037772" observedRunningTime="2025-10-06 07:08:39.635419009 +0000 UTC m=+1404.150160017" watchObservedRunningTime="2025-10-06 07:08:39.640271214 +0000 UTC m=+1404.155012222" Oct 06 07:08:42 crc kubenswrapper[4845]: I1006 07:08:42.640081 4845 generic.go:334] "Generic (PLEG): container finished" podID="5ca7a9ec-05b1-46ae-bc84-065bf4904784" containerID="1168fd1788d092a019d7435d54a77b4c40613df7a5dc46c7c80e685ed3ccc7fe" exitCode=0 Oct 06 07:08:42 crc kubenswrapper[4845]: I1006 07:08:42.640187 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" event={"ID":"5ca7a9ec-05b1-46ae-bc84-065bf4904784","Type":"ContainerDied","Data":"1168fd1788d092a019d7435d54a77b4c40613df7a5dc46c7c80e685ed3ccc7fe"} Oct 06 07:08:44 crc 
kubenswrapper[4845]: I1006 07:08:44.058494 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.248566 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca7a9ec-05b1-46ae-bc84-065bf4904784-bootstrap-combined-ca-bundle\") pod \"5ca7a9ec-05b1-46ae-bc84-065bf4904784\" (UID: \"5ca7a9ec-05b1-46ae-bc84-065bf4904784\") " Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.248758 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ca7a9ec-05b1-46ae-bc84-065bf4904784-ssh-key\") pod \"5ca7a9ec-05b1-46ae-bc84-065bf4904784\" (UID: \"5ca7a9ec-05b1-46ae-bc84-065bf4904784\") " Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.248831 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngn9j\" (UniqueName: \"kubernetes.io/projected/5ca7a9ec-05b1-46ae-bc84-065bf4904784-kube-api-access-ngn9j\") pod \"5ca7a9ec-05b1-46ae-bc84-065bf4904784\" (UID: \"5ca7a9ec-05b1-46ae-bc84-065bf4904784\") " Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.248862 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ca7a9ec-05b1-46ae-bc84-065bf4904784-inventory\") pod \"5ca7a9ec-05b1-46ae-bc84-065bf4904784\" (UID: \"5ca7a9ec-05b1-46ae-bc84-065bf4904784\") " Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.254125 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ca7a9ec-05b1-46ae-bc84-065bf4904784-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5ca7a9ec-05b1-46ae-bc84-065bf4904784" (UID: "5ca7a9ec-05b1-46ae-bc84-065bf4904784"). 
InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.254588 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ca7a9ec-05b1-46ae-bc84-065bf4904784-kube-api-access-ngn9j" (OuterVolumeSpecName: "kube-api-access-ngn9j") pod "5ca7a9ec-05b1-46ae-bc84-065bf4904784" (UID: "5ca7a9ec-05b1-46ae-bc84-065bf4904784"). InnerVolumeSpecName "kube-api-access-ngn9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.282997 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ca7a9ec-05b1-46ae-bc84-065bf4904784-inventory" (OuterVolumeSpecName: "inventory") pod "5ca7a9ec-05b1-46ae-bc84-065bf4904784" (UID: "5ca7a9ec-05b1-46ae-bc84-065bf4904784"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.283354 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ca7a9ec-05b1-46ae-bc84-065bf4904784-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5ca7a9ec-05b1-46ae-bc84-065bf4904784" (UID: "5ca7a9ec-05b1-46ae-bc84-065bf4904784"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.350998 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngn9j\" (UniqueName: \"kubernetes.io/projected/5ca7a9ec-05b1-46ae-bc84-065bf4904784-kube-api-access-ngn9j\") on node \"crc\" DevicePath \"\"" Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.351025 4845 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ca7a9ec-05b1-46ae-bc84-065bf4904784-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.351036 4845 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca7a9ec-05b1-46ae-bc84-065bf4904784-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.351044 4845 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ca7a9ec-05b1-46ae-bc84-065bf4904784-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.659687 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4" event={"ID":"5ca7a9ec-05b1-46ae-bc84-065bf4904784","Type":"ContainerDied","Data":"783a9f483b7f54148f8357e83128c363f32c0eacde067b9649de4d5a7552dd15"} Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.659726 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="783a9f483b7f54148f8357e83128c363f32c0eacde067b9649de4d5a7552dd15" Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.659726 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4"
Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.736759 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj"]
Oct 06 07:08:44 crc kubenswrapper[4845]: E1006 07:08:44.737205 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ca7a9ec-05b1-46ae-bc84-065bf4904784" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.737227 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ca7a9ec-05b1-46ae-bc84-065bf4904784" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.737501 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ca7a9ec-05b1-46ae-bc84-065bf4904784" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.738143 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj"
Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.740507 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.740538 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p48vv"
Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.740781 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.741260 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.745637 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj"]
Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.865424 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5fns\" (UniqueName: \"kubernetes.io/projected/e0ba0e05-c816-48c7-9d88-a735ea82f3eb-kube-api-access-z5fns\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj\" (UID: \"e0ba0e05-c816-48c7-9d88-a735ea82f3eb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj"
Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.865515 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0ba0e05-c816-48c7-9d88-a735ea82f3eb-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj\" (UID: \"e0ba0e05-c816-48c7-9d88-a735ea82f3eb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj"
Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.865609 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0ba0e05-c816-48c7-9d88-a735ea82f3eb-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj\" (UID: \"e0ba0e05-c816-48c7-9d88-a735ea82f3eb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj"
Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.967508 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5fns\" (UniqueName: \"kubernetes.io/projected/e0ba0e05-c816-48c7-9d88-a735ea82f3eb-kube-api-access-z5fns\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj\" (UID: \"e0ba0e05-c816-48c7-9d88-a735ea82f3eb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj"
Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.967585 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0ba0e05-c816-48c7-9d88-a735ea82f3eb-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj\" (UID: \"e0ba0e05-c816-48c7-9d88-a735ea82f3eb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj"
Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.967652 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0ba0e05-c816-48c7-9d88-a735ea82f3eb-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj\" (UID: \"e0ba0e05-c816-48c7-9d88-a735ea82f3eb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj"
Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.971945 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0ba0e05-c816-48c7-9d88-a735ea82f3eb-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj\" (UID: \"e0ba0e05-c816-48c7-9d88-a735ea82f3eb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj"
Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.978021 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0ba0e05-c816-48c7-9d88-a735ea82f3eb-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj\" (UID: \"e0ba0e05-c816-48c7-9d88-a735ea82f3eb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj"
Oct 06 07:08:44 crc kubenswrapper[4845]: I1006 07:08:44.983045 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5fns\" (UniqueName: \"kubernetes.io/projected/e0ba0e05-c816-48c7-9d88-a735ea82f3eb-kube-api-access-z5fns\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj\" (UID: \"e0ba0e05-c816-48c7-9d88-a735ea82f3eb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj"
Oct 06 07:08:45 crc kubenswrapper[4845]: I1006 07:08:45.065754 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj"
Oct 06 07:08:45 crc kubenswrapper[4845]: I1006 07:08:45.517514 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pv7kf"
Oct 06 07:08:45 crc kubenswrapper[4845]: I1006 07:08:45.518884 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pv7kf"
Oct 06 07:08:45 crc kubenswrapper[4845]: I1006 07:08:45.568034 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pv7kf"
Oct 06 07:08:45 crc kubenswrapper[4845]: I1006 07:08:45.637539 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj"]
Oct 06 07:08:45 crc kubenswrapper[4845]: I1006 07:08:45.669661 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj" event={"ID":"e0ba0e05-c816-48c7-9d88-a735ea82f3eb","Type":"ContainerStarted","Data":"570d042eff76a1655436c24b4a6d02401fdf50238c12231dd9255b6483351d9a"}
Oct 06 07:08:45 crc kubenswrapper[4845]: I1006 07:08:45.713047 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pv7kf"
Oct 06 07:08:45 crc kubenswrapper[4845]: I1006 07:08:45.802597 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pv7kf"]
Oct 06 07:08:46 crc kubenswrapper[4845]: I1006 07:08:46.682440 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj" event={"ID":"e0ba0e05-c816-48c7-9d88-a735ea82f3eb","Type":"ContainerStarted","Data":"07eef51172fb6aa66ba86ed06cf514bf278791b64ac87a245e523adcf964098b"}
Oct 06 07:08:46 crc kubenswrapper[4845]: I1006 07:08:46.696922 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj" podStartSLOduration=2.266526892 podStartE2EDuration="2.696898721s" podCreationTimestamp="2025-10-06 07:08:44 +0000 UTC" firstStartedPulling="2025-10-06 07:08:45.646914834 +0000 UTC m=+1410.161655842" lastFinishedPulling="2025-10-06 07:08:46.077286663 +0000 UTC m=+1410.592027671" observedRunningTime="2025-10-06 07:08:46.694962041 +0000 UTC m=+1411.209703099" watchObservedRunningTime="2025-10-06 07:08:46.696898721 +0000 UTC m=+1411.211639779"
Oct 06 07:08:47 crc kubenswrapper[4845]: I1006 07:08:47.701184 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pv7kf" podUID="b972b9f4-50ac-48f9-a750-16eff3d468d2" containerName="registry-server" containerID="cri-o://794be9c3addb5b3b5df281dedcec8460e1933c3927ad7dac70be2211bac29ec9" gracePeriod=2
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.183228 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pv7kf"
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.280404 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b972b9f4-50ac-48f9-a750-16eff3d468d2-catalog-content\") pod \"b972b9f4-50ac-48f9-a750-16eff3d468d2\" (UID: \"b972b9f4-50ac-48f9-a750-16eff3d468d2\") "
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.280457 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnsrq\" (UniqueName: \"kubernetes.io/projected/b972b9f4-50ac-48f9-a750-16eff3d468d2-kube-api-access-jnsrq\") pod \"b972b9f4-50ac-48f9-a750-16eff3d468d2\" (UID: \"b972b9f4-50ac-48f9-a750-16eff3d468d2\") "
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.280566 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b972b9f4-50ac-48f9-a750-16eff3d468d2-utilities\") pod \"b972b9f4-50ac-48f9-a750-16eff3d468d2\" (UID: \"b972b9f4-50ac-48f9-a750-16eff3d468d2\") "
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.283200 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b972b9f4-50ac-48f9-a750-16eff3d468d2-utilities" (OuterVolumeSpecName: "utilities") pod "b972b9f4-50ac-48f9-a750-16eff3d468d2" (UID: "b972b9f4-50ac-48f9-a750-16eff3d468d2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.291647 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b972b9f4-50ac-48f9-a750-16eff3d468d2-kube-api-access-jnsrq" (OuterVolumeSpecName: "kube-api-access-jnsrq") pod "b972b9f4-50ac-48f9-a750-16eff3d468d2" (UID: "b972b9f4-50ac-48f9-a750-16eff3d468d2"). InnerVolumeSpecName "kube-api-access-jnsrq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.327312 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b972b9f4-50ac-48f9-a750-16eff3d468d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b972b9f4-50ac-48f9-a750-16eff3d468d2" (UID: "b972b9f4-50ac-48f9-a750-16eff3d468d2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.381715 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b972b9f4-50ac-48f9-a750-16eff3d468d2-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.381762 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b972b9f4-50ac-48f9-a750-16eff3d468d2-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.381774 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnsrq\" (UniqueName: \"kubernetes.io/projected/b972b9f4-50ac-48f9-a750-16eff3d468d2-kube-api-access-jnsrq\") on node \"crc\" DevicePath \"\""
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.711243 4845 generic.go:334] "Generic (PLEG): container finished" podID="b972b9f4-50ac-48f9-a750-16eff3d468d2" containerID="794be9c3addb5b3b5df281dedcec8460e1933c3927ad7dac70be2211bac29ec9" exitCode=0
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.711290 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pv7kf" event={"ID":"b972b9f4-50ac-48f9-a750-16eff3d468d2","Type":"ContainerDied","Data":"794be9c3addb5b3b5df281dedcec8460e1933c3927ad7dac70be2211bac29ec9"}
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.711517 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pv7kf" event={"ID":"b972b9f4-50ac-48f9-a750-16eff3d468d2","Type":"ContainerDied","Data":"c501c5020f0492a35407c5ea8cccd0c1d381a8402193964717b8697fee8ac770"}
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.711303 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pv7kf"
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.711537 4845 scope.go:117] "RemoveContainer" containerID="794be9c3addb5b3b5df281dedcec8460e1933c3927ad7dac70be2211bac29ec9"
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.742820 4845 scope.go:117] "RemoveContainer" containerID="426c12116b98f66aee474fdd4ba2a147ae405dd05165b67b0568b3b6b88e0286"
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.745554 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pv7kf"]
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.754529 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pv7kf"]
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.767994 4845 scope.go:117] "RemoveContainer" containerID="c1d421d9d7e8efd49fcd689550a3c49b2cdaa39b62786dd84f21a9370013e3c2"
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.822335 4845 scope.go:117] "RemoveContainer" containerID="794be9c3addb5b3b5df281dedcec8460e1933c3927ad7dac70be2211bac29ec9"
Oct 06 07:08:48 crc kubenswrapper[4845]: E1006 07:08:48.822844 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"794be9c3addb5b3b5df281dedcec8460e1933c3927ad7dac70be2211bac29ec9\": container with ID starting with 794be9c3addb5b3b5df281dedcec8460e1933c3927ad7dac70be2211bac29ec9 not found: ID does not exist" containerID="794be9c3addb5b3b5df281dedcec8460e1933c3927ad7dac70be2211bac29ec9"
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.822905 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"794be9c3addb5b3b5df281dedcec8460e1933c3927ad7dac70be2211bac29ec9"} err="failed to get container status \"794be9c3addb5b3b5df281dedcec8460e1933c3927ad7dac70be2211bac29ec9\": rpc error: code = NotFound desc = could not find container \"794be9c3addb5b3b5df281dedcec8460e1933c3927ad7dac70be2211bac29ec9\": container with ID starting with 794be9c3addb5b3b5df281dedcec8460e1933c3927ad7dac70be2211bac29ec9 not found: ID does not exist"
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.822937 4845 scope.go:117] "RemoveContainer" containerID="426c12116b98f66aee474fdd4ba2a147ae405dd05165b67b0568b3b6b88e0286"
Oct 06 07:08:48 crc kubenswrapper[4845]: E1006 07:08:48.823280 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"426c12116b98f66aee474fdd4ba2a147ae405dd05165b67b0568b3b6b88e0286\": container with ID starting with 426c12116b98f66aee474fdd4ba2a147ae405dd05165b67b0568b3b6b88e0286 not found: ID does not exist" containerID="426c12116b98f66aee474fdd4ba2a147ae405dd05165b67b0568b3b6b88e0286"
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.823316 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"426c12116b98f66aee474fdd4ba2a147ae405dd05165b67b0568b3b6b88e0286"} err="failed to get container status \"426c12116b98f66aee474fdd4ba2a147ae405dd05165b67b0568b3b6b88e0286\": rpc error: code = NotFound desc = could not find container \"426c12116b98f66aee474fdd4ba2a147ae405dd05165b67b0568b3b6b88e0286\": container with ID starting with 426c12116b98f66aee474fdd4ba2a147ae405dd05165b67b0568b3b6b88e0286 not found: ID does not exist"
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.823343 4845 scope.go:117] "RemoveContainer" containerID="c1d421d9d7e8efd49fcd689550a3c49b2cdaa39b62786dd84f21a9370013e3c2"
Oct 06 07:08:48 crc kubenswrapper[4845]: E1006 07:08:48.823673 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1d421d9d7e8efd49fcd689550a3c49b2cdaa39b62786dd84f21a9370013e3c2\": container with ID starting with c1d421d9d7e8efd49fcd689550a3c49b2cdaa39b62786dd84f21a9370013e3c2 not found: ID does not exist" containerID="c1d421d9d7e8efd49fcd689550a3c49b2cdaa39b62786dd84f21a9370013e3c2"
Oct 06 07:08:48 crc kubenswrapper[4845]: I1006 07:08:48.823712 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1d421d9d7e8efd49fcd689550a3c49b2cdaa39b62786dd84f21a9370013e3c2"} err="failed to get container status \"c1d421d9d7e8efd49fcd689550a3c49b2cdaa39b62786dd84f21a9370013e3c2\": rpc error: code = NotFound desc = could not find container \"c1d421d9d7e8efd49fcd689550a3c49b2cdaa39b62786dd84f21a9370013e3c2\": container with ID starting with c1d421d9d7e8efd49fcd689550a3c49b2cdaa39b62786dd84f21a9370013e3c2 not found: ID does not exist"
Oct 06 07:08:50 crc kubenswrapper[4845]: I1006 07:08:50.238665 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b972b9f4-50ac-48f9-a750-16eff3d468d2" path="/var/lib/kubelet/pods/b972b9f4-50ac-48f9-a750-16eff3d468d2/volumes"
Oct 06 07:09:22 crc kubenswrapper[4845]: I1006 07:09:22.503513 4845 scope.go:117] "RemoveContainer" containerID="7578f2c015dc605af12885d3129d99a7d626c0827944e79bff977c3fb32fd2e9"
Oct 06 07:09:22 crc kubenswrapper[4845]: I1006 07:09:22.535401 4845 scope.go:117] "RemoveContainer" containerID="63c80eddb23dbda3d5ed0d26250eb5ce6f1eddbc0aec31719d0d651951454aeb"
Oct 06 07:09:22 crc kubenswrapper[4845]: I1006 07:09:22.557210 4845 scope.go:117] "RemoveContainer" containerID="e272e8a9a1ce4260b16abab94201745db8528ed805fa227d0441aed4998bcfdd"
Oct 06 07:09:22 crc kubenswrapper[4845]: I1006 07:09:22.595186 4845 scope.go:117] "RemoveContainer" containerID="6ed8a6cf4c9f639d31c13004eb3e0927b3b839f9d3c44c0beda079768726870a"
Oct 06 07:09:23 crc kubenswrapper[4845]: I1006 07:09:23.018808 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 07:09:23 crc kubenswrapper[4845]: I1006 07:09:23.018874 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 07:09:42 crc kubenswrapper[4845]: I1006 07:09:42.429252 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-glttd"]
Oct 06 07:09:42 crc kubenswrapper[4845]: E1006 07:09:42.432492 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b972b9f4-50ac-48f9-a750-16eff3d468d2" containerName="extract-content"
Oct 06 07:09:42 crc kubenswrapper[4845]: I1006 07:09:42.432586 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="b972b9f4-50ac-48f9-a750-16eff3d468d2" containerName="extract-content"
Oct 06 07:09:42 crc kubenswrapper[4845]: E1006 07:09:42.432669 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b972b9f4-50ac-48f9-a750-16eff3d468d2" containerName="extract-utilities"
Oct 06 07:09:42 crc kubenswrapper[4845]: I1006 07:09:42.432731 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="b972b9f4-50ac-48f9-a750-16eff3d468d2" containerName="extract-utilities"
Oct 06 07:09:42 crc kubenswrapper[4845]: E1006 07:09:42.432792 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b972b9f4-50ac-48f9-a750-16eff3d468d2" containerName="registry-server"
Oct 06 07:09:42 crc kubenswrapper[4845]: I1006 07:09:42.432846 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="b972b9f4-50ac-48f9-a750-16eff3d468d2" containerName="registry-server"
Oct 06 07:09:42 crc kubenswrapper[4845]: I1006 07:09:42.433142 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="b972b9f4-50ac-48f9-a750-16eff3d468d2" containerName="registry-server"
Oct 06 07:09:42 crc kubenswrapper[4845]: I1006 07:09:42.435181 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-glttd"
Oct 06 07:09:42 crc kubenswrapper[4845]: I1006 07:09:42.447555 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-glttd"]
Oct 06 07:09:42 crc kubenswrapper[4845]: I1006 07:09:42.626481 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3229aa3-4bcd-42d2-be10-3d8157e4d7f5-utilities\") pod \"community-operators-glttd\" (UID: \"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5\") " pod="openshift-marketplace/community-operators-glttd"
Oct 06 07:09:42 crc kubenswrapper[4845]: I1006 07:09:42.627189 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3229aa3-4bcd-42d2-be10-3d8157e4d7f5-catalog-content\") pod \"community-operators-glttd\" (UID: \"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5\") " pod="openshift-marketplace/community-operators-glttd"
Oct 06 07:09:42 crc kubenswrapper[4845]: I1006 07:09:42.627410 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6lpb\" (UniqueName: \"kubernetes.io/projected/c3229aa3-4bcd-42d2-be10-3d8157e4d7f5-kube-api-access-l6lpb\") pod \"community-operators-glttd\" (UID: \"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5\") " pod="openshift-marketplace/community-operators-glttd"
Oct 06 07:09:42 crc kubenswrapper[4845]: I1006 07:09:42.730766 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6lpb\" (UniqueName: \"kubernetes.io/projected/c3229aa3-4bcd-42d2-be10-3d8157e4d7f5-kube-api-access-l6lpb\") pod \"community-operators-glttd\" (UID: \"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5\") " pod="openshift-marketplace/community-operators-glttd"
Oct 06 07:09:42 crc kubenswrapper[4845]: I1006 07:09:42.730876 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3229aa3-4bcd-42d2-be10-3d8157e4d7f5-utilities\") pod \"community-operators-glttd\" (UID: \"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5\") " pod="openshift-marketplace/community-operators-glttd"
Oct 06 07:09:42 crc kubenswrapper[4845]: I1006 07:09:42.730911 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3229aa3-4bcd-42d2-be10-3d8157e4d7f5-catalog-content\") pod \"community-operators-glttd\" (UID: \"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5\") " pod="openshift-marketplace/community-operators-glttd"
Oct 06 07:09:42 crc kubenswrapper[4845]: I1006 07:09:42.731504 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3229aa3-4bcd-42d2-be10-3d8157e4d7f5-catalog-content\") pod \"community-operators-glttd\" (UID: \"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5\") " pod="openshift-marketplace/community-operators-glttd"
Oct 06 07:09:42 crc kubenswrapper[4845]: I1006 07:09:42.731735 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3229aa3-4bcd-42d2-be10-3d8157e4d7f5-utilities\") pod \"community-operators-glttd\" (UID: \"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5\") " pod="openshift-marketplace/community-operators-glttd"
Oct 06 07:09:42 crc kubenswrapper[4845]: I1006 07:09:42.749659 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6lpb\" (UniqueName: \"kubernetes.io/projected/c3229aa3-4bcd-42d2-be10-3d8157e4d7f5-kube-api-access-l6lpb\") pod \"community-operators-glttd\" (UID: \"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5\") " pod="openshift-marketplace/community-operators-glttd"
Oct 06 07:09:42 crc kubenswrapper[4845]: I1006 07:09:42.763448 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-glttd"
Oct 06 07:09:43 crc kubenswrapper[4845]: I1006 07:09:43.330205 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-glttd"]
Oct 06 07:09:44 crc kubenswrapper[4845]: I1006 07:09:44.258892 4845 generic.go:334] "Generic (PLEG): container finished" podID="c3229aa3-4bcd-42d2-be10-3d8157e4d7f5" containerID="ab88d130f0b17e0c146f4dbf6b13cd291ce95686772ca03a9554bb55a0ae64cf" exitCode=0
Oct 06 07:09:44 crc kubenswrapper[4845]: I1006 07:09:44.258952 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-glttd" event={"ID":"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5","Type":"ContainerDied","Data":"ab88d130f0b17e0c146f4dbf6b13cd291ce95686772ca03a9554bb55a0ae64cf"}
Oct 06 07:09:44 crc kubenswrapper[4845]: I1006 07:09:44.259187 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-glttd" event={"ID":"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5","Type":"ContainerStarted","Data":"4ffae4a002cea071e069f6d3edd560cadc32967f190c09aa47246dee60044260"}
Oct 06 07:09:45 crc kubenswrapper[4845]: I1006 07:09:45.272810 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-glttd" event={"ID":"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5","Type":"ContainerStarted","Data":"282729ede17b0649f30db8b1a67f0ead10ffaaf36056fbef7a88ed3321334e6e"}
Oct 06 07:09:46 crc kubenswrapper[4845]: I1006 07:09:46.282295 4845 generic.go:334] "Generic (PLEG): container finished" podID="c3229aa3-4bcd-42d2-be10-3d8157e4d7f5" containerID="282729ede17b0649f30db8b1a67f0ead10ffaaf36056fbef7a88ed3321334e6e" exitCode=0
Oct 06 07:09:46 crc kubenswrapper[4845]: I1006 07:09:46.282397 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-glttd" event={"ID":"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5","Type":"ContainerDied","Data":"282729ede17b0649f30db8b1a67f0ead10ffaaf36056fbef7a88ed3321334e6e"}
Oct 06 07:09:47 crc kubenswrapper[4845]: I1006 07:09:47.293782 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-glttd" event={"ID":"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5","Type":"ContainerStarted","Data":"2de4c78fdf9a13555f074bc56e669b0f832304fbb1e19d187c1078f38f9f64d4"}
Oct 06 07:09:47 crc kubenswrapper[4845]: I1006 07:09:47.321804 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-glttd" podStartSLOduration=2.923764592 podStartE2EDuration="5.321787971s" podCreationTimestamp="2025-10-06 07:09:42 +0000 UTC" firstStartedPulling="2025-10-06 07:09:44.26049522 +0000 UTC m=+1468.775236228" lastFinishedPulling="2025-10-06 07:09:46.658518599 +0000 UTC m=+1471.173259607" observedRunningTime="2025-10-06 07:09:47.312834003 +0000 UTC m=+1471.827575061" watchObservedRunningTime="2025-10-06 07:09:47.321787971 +0000 UTC m=+1471.836528979"
Oct 06 07:09:52 crc kubenswrapper[4845]: I1006 07:09:52.764622 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-glttd"
Oct 06 07:09:52 crc kubenswrapper[4845]: I1006 07:09:52.765188 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-glttd"
Oct 06 07:09:52 crc kubenswrapper[4845]: I1006 07:09:52.822334 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-glttd"
Oct 06 07:09:53 crc kubenswrapper[4845]: I1006 07:09:53.019135 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 07:09:53 crc kubenswrapper[4845]: I1006 07:09:53.019190 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 07:09:53 crc kubenswrapper[4845]: I1006 07:09:53.433898 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-glttd"
Oct 06 07:09:53 crc kubenswrapper[4845]: I1006 07:09:53.528511 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-glttd"]
Oct 06 07:09:55 crc kubenswrapper[4845]: I1006 07:09:55.364204 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-glttd" podUID="c3229aa3-4bcd-42d2-be10-3d8157e4d7f5" containerName="registry-server" containerID="cri-o://2de4c78fdf9a13555f074bc56e669b0f832304fbb1e19d187c1078f38f9f64d4" gracePeriod=2
Oct 06 07:09:55 crc kubenswrapper[4845]: I1006 07:09:55.803812 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-glttd"
Oct 06 07:09:55 crc kubenswrapper[4845]: I1006 07:09:55.920116 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3229aa3-4bcd-42d2-be10-3d8157e4d7f5-utilities\") pod \"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5\" (UID: \"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5\") "
Oct 06 07:09:55 crc kubenswrapper[4845]: I1006 07:09:55.920459 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3229aa3-4bcd-42d2-be10-3d8157e4d7f5-catalog-content\") pod \"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5\" (UID: \"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5\") "
Oct 06 07:09:55 crc kubenswrapper[4845]: I1006 07:09:55.920658 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6lpb\" (UniqueName: \"kubernetes.io/projected/c3229aa3-4bcd-42d2-be10-3d8157e4d7f5-kube-api-access-l6lpb\") pod \"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5\" (UID: \"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5\") "
Oct 06 07:09:55 crc kubenswrapper[4845]: I1006 07:09:55.921141 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3229aa3-4bcd-42d2-be10-3d8157e4d7f5-utilities" (OuterVolumeSpecName: "utilities") pod "c3229aa3-4bcd-42d2-be10-3d8157e4d7f5" (UID: "c3229aa3-4bcd-42d2-be10-3d8157e4d7f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:09:55 crc kubenswrapper[4845]: I1006 07:09:55.921562 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3229aa3-4bcd-42d2-be10-3d8157e4d7f5-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 07:09:55 crc kubenswrapper[4845]: I1006 07:09:55.926058 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3229aa3-4bcd-42d2-be10-3d8157e4d7f5-kube-api-access-l6lpb" (OuterVolumeSpecName: "kube-api-access-l6lpb") pod "c3229aa3-4bcd-42d2-be10-3d8157e4d7f5" (UID: "c3229aa3-4bcd-42d2-be10-3d8157e4d7f5"). InnerVolumeSpecName "kube-api-access-l6lpb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:09:55 crc kubenswrapper[4845]: I1006 07:09:55.975894 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3229aa3-4bcd-42d2-be10-3d8157e4d7f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3229aa3-4bcd-42d2-be10-3d8157e4d7f5" (UID: "c3229aa3-4bcd-42d2-be10-3d8157e4d7f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:09:56 crc kubenswrapper[4845]: I1006 07:09:56.023759 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6lpb\" (UniqueName: \"kubernetes.io/projected/c3229aa3-4bcd-42d2-be10-3d8157e4d7f5-kube-api-access-l6lpb\") on node \"crc\" DevicePath \"\""
Oct 06 07:09:56 crc kubenswrapper[4845]: I1006 07:09:56.024000 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3229aa3-4bcd-42d2-be10-3d8157e4d7f5-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 07:09:56 crc kubenswrapper[4845]: I1006 07:09:56.374802 4845 generic.go:334] "Generic (PLEG): container finished" podID="c3229aa3-4bcd-42d2-be10-3d8157e4d7f5" containerID="2de4c78fdf9a13555f074bc56e669b0f832304fbb1e19d187c1078f38f9f64d4" exitCode=0
Oct 06 07:09:56 crc kubenswrapper[4845]: I1006 07:09:56.374855 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-glttd" event={"ID":"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5","Type":"ContainerDied","Data":"2de4c78fdf9a13555f074bc56e669b0f832304fbb1e19d187c1078f38f9f64d4"}
Oct 06 07:09:56 crc kubenswrapper[4845]: I1006 07:09:56.374906 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-glttd" event={"ID":"c3229aa3-4bcd-42d2-be10-3d8157e4d7f5","Type":"ContainerDied","Data":"4ffae4a002cea071e069f6d3edd560cadc32967f190c09aa47246dee60044260"}
Oct 06 07:09:56 crc kubenswrapper[4845]: I1006 07:09:56.374923 4845 scope.go:117] "RemoveContainer" containerID="2de4c78fdf9a13555f074bc56e669b0f832304fbb1e19d187c1078f38f9f64d4"
Oct 06 07:09:56 crc kubenswrapper[4845]: I1006 07:09:56.374919 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-glttd"
Oct 06 07:09:56 crc kubenswrapper[4845]: I1006 07:09:56.394484 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-glttd"]
Oct 06 07:09:56 crc kubenswrapper[4845]: I1006 07:09:56.399693 4845 scope.go:117] "RemoveContainer" containerID="282729ede17b0649f30db8b1a67f0ead10ffaaf36056fbef7a88ed3321334e6e"
Oct 06 07:09:56 crc kubenswrapper[4845]: I1006 07:09:56.402343 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-glttd"]
Oct 06 07:09:56 crc kubenswrapper[4845]: I1006 07:09:56.422176 4845 scope.go:117] "RemoveContainer" containerID="ab88d130f0b17e0c146f4dbf6b13cd291ce95686772ca03a9554bb55a0ae64cf"
Oct 06 07:09:56 crc kubenswrapper[4845]: I1006 07:09:56.477932 4845 scope.go:117] "RemoveContainer" containerID="2de4c78fdf9a13555f074bc56e669b0f832304fbb1e19d187c1078f38f9f64d4"
Oct 06 07:09:56 crc kubenswrapper[4845]: E1006 07:09:56.478525 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2de4c78fdf9a13555f074bc56e669b0f832304fbb1e19d187c1078f38f9f64d4\": container with ID starting with 2de4c78fdf9a13555f074bc56e669b0f832304fbb1e19d187c1078f38f9f64d4 not found: ID does not exist" containerID="2de4c78fdf9a13555f074bc56e669b0f832304fbb1e19d187c1078f38f9f64d4"
Oct 06 07:09:56 crc kubenswrapper[4845]: I1006 07:09:56.478584 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2de4c78fdf9a13555f074bc56e669b0f832304fbb1e19d187c1078f38f9f64d4"} err="failed to get container status \"2de4c78fdf9a13555f074bc56e669b0f832304fbb1e19d187c1078f38f9f64d4\": rpc error: code = NotFound desc = could not find container \"2de4c78fdf9a13555f074bc56e669b0f832304fbb1e19d187c1078f38f9f64d4\": container with ID starting with 2de4c78fdf9a13555f074bc56e669b0f832304fbb1e19d187c1078f38f9f64d4 not found: ID does not exist"
Oct 06 07:09:56 crc kubenswrapper[4845]: I1006 07:09:56.478623 4845 scope.go:117] "RemoveContainer" containerID="282729ede17b0649f30db8b1a67f0ead10ffaaf36056fbef7a88ed3321334e6e"
Oct 06 07:09:56 crc kubenswrapper[4845]: E1006 07:09:56.479060 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"282729ede17b0649f30db8b1a67f0ead10ffaaf36056fbef7a88ed3321334e6e\": container with ID starting with 282729ede17b0649f30db8b1a67f0ead10ffaaf36056fbef7a88ed3321334e6e not found: ID does not exist" containerID="282729ede17b0649f30db8b1a67f0ead10ffaaf36056fbef7a88ed3321334e6e"
Oct 06 07:09:56 crc kubenswrapper[4845]: I1006 07:09:56.479091 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"282729ede17b0649f30db8b1a67f0ead10ffaaf36056fbef7a88ed3321334e6e"} err="failed to get container status \"282729ede17b0649f30db8b1a67f0ead10ffaaf36056fbef7a88ed3321334e6e\": rpc error: code = NotFound desc = could not find container \"282729ede17b0649f30db8b1a67f0ead10ffaaf36056fbef7a88ed3321334e6e\": container with ID starting with 282729ede17b0649f30db8b1a67f0ead10ffaaf36056fbef7a88ed3321334e6e not found: ID does not exist"
Oct 06 07:09:56 crc kubenswrapper[4845]: I1006 07:09:56.479115 4845 scope.go:117] "RemoveContainer" containerID="ab88d130f0b17e0c146f4dbf6b13cd291ce95686772ca03a9554bb55a0ae64cf"
Oct 06 07:09:56 crc kubenswrapper[4845]: E1006 07:09:56.479456 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab88d130f0b17e0c146f4dbf6b13cd291ce95686772ca03a9554bb55a0ae64cf\": container with ID starting with ab88d130f0b17e0c146f4dbf6b13cd291ce95686772ca03a9554bb55a0ae64cf not found: ID does not exist" containerID="ab88d130f0b17e0c146f4dbf6b13cd291ce95686772ca03a9554bb55a0ae64cf"
Oct 06 07:09:56 crc kubenswrapper[4845]: I1006 07:09:56.479497 4845
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab88d130f0b17e0c146f4dbf6b13cd291ce95686772ca03a9554bb55a0ae64cf"} err="failed to get container status \"ab88d130f0b17e0c146f4dbf6b13cd291ce95686772ca03a9554bb55a0ae64cf\": rpc error: code = NotFound desc = could not find container \"ab88d130f0b17e0c146f4dbf6b13cd291ce95686772ca03a9554bb55a0ae64cf\": container with ID starting with ab88d130f0b17e0c146f4dbf6b13cd291ce95686772ca03a9554bb55a0ae64cf not found: ID does not exist" Oct 06 07:09:58 crc kubenswrapper[4845]: I1006 07:09:58.249952 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3229aa3-4bcd-42d2-be10-3d8157e4d7f5" path="/var/lib/kubelet/pods/c3229aa3-4bcd-42d2-be10-3d8157e4d7f5/volumes" Oct 06 07:10:18 crc kubenswrapper[4845]: I1006 07:10:18.054699 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-ksn8k"] Oct 06 07:10:18 crc kubenswrapper[4845]: I1006 07:10:18.065439 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-ksn8k"] Oct 06 07:10:18 crc kubenswrapper[4845]: I1006 07:10:18.240464 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31afef42-8484-46bf-ab3e-be1b8eb4dd4a" path="/var/lib/kubelet/pods/31afef42-8484-46bf-ab3e-be1b8eb4dd4a/volumes" Oct 06 07:10:22 crc kubenswrapper[4845]: I1006 07:10:22.693689 4845 scope.go:117] "RemoveContainer" containerID="69fd1df320dce22c38062a718f785d3ff4f405588ba34bec3b404a1b9b8d147c" Oct 06 07:10:23 crc kubenswrapper[4845]: I1006 07:10:23.019091 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 07:10:23 crc kubenswrapper[4845]: I1006 07:10:23.019177 4845 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 07:10:23 crc kubenswrapper[4845]: I1006 07:10:23.019236 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" Oct 06 07:10:23 crc kubenswrapper[4845]: I1006 07:10:23.020177 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9"} pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 07:10:23 crc kubenswrapper[4845]: I1006 07:10:23.020265 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" containerID="cri-o://49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" gracePeriod=600 Oct 06 07:10:23 crc kubenswrapper[4845]: I1006 07:10:23.042474 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-kf6m4"] Oct 06 07:10:23 crc kubenswrapper[4845]: I1006 07:10:23.055068 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-w87zr"] Oct 06 07:10:23 crc kubenswrapper[4845]: I1006 07:10:23.067246 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-w87zr"] Oct 06 07:10:23 crc kubenswrapper[4845]: I1006 07:10:23.079511 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-kf6m4"] Oct 06 07:10:23 crc kubenswrapper[4845]: E1006 
07:10:23.152692 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:10:23 crc kubenswrapper[4845]: I1006 07:10:23.674191 4845 generic.go:334] "Generic (PLEG): container finished" podID="6936952c-09f0-48fd-8832-38c18202ae81" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" exitCode=0 Oct 06 07:10:23 crc kubenswrapper[4845]: I1006 07:10:23.674250 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerDied","Data":"49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9"} Oct 06 07:10:23 crc kubenswrapper[4845]: I1006 07:10:23.674323 4845 scope.go:117] "RemoveContainer" containerID="747bfea9f095b79ec9738d4756f5ae602d44718e89fd27297d3071705491e54c" Oct 06 07:10:23 crc kubenswrapper[4845]: I1006 07:10:23.674957 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:10:23 crc kubenswrapper[4845]: E1006 07:10:23.675272 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:10:24 crc kubenswrapper[4845]: I1006 07:10:24.248074 4845 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="1dbb4b1f-cb80-4588-9871-c7df1d082077" path="/var/lib/kubelet/pods/1dbb4b1f-cb80-4588-9871-c7df1d082077/volumes" Oct 06 07:10:24 crc kubenswrapper[4845]: I1006 07:10:24.248799 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59bbac9c-a7a6-485d-b796-ca047c8391c7" path="/var/lib/kubelet/pods/59bbac9c-a7a6-485d-b796-ca047c8391c7/volumes" Oct 06 07:10:27 crc kubenswrapper[4845]: I1006 07:10:27.032662 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9389-account-create-rzx65"] Oct 06 07:10:27 crc kubenswrapper[4845]: I1006 07:10:27.040310 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9389-account-create-rzx65"] Oct 06 07:10:28 crc kubenswrapper[4845]: I1006 07:10:28.238686 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="641d0255-700b-41be-899e-e23f2129be3a" path="/var/lib/kubelet/pods/641d0255-700b-41be-899e-e23f2129be3a/volumes" Oct 06 07:10:32 crc kubenswrapper[4845]: I1006 07:10:32.753637 4845 generic.go:334] "Generic (PLEG): container finished" podID="e0ba0e05-c816-48c7-9d88-a735ea82f3eb" containerID="07eef51172fb6aa66ba86ed06cf514bf278791b64ac87a245e523adcf964098b" exitCode=0 Oct 06 07:10:32 crc kubenswrapper[4845]: I1006 07:10:32.753711 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj" event={"ID":"e0ba0e05-c816-48c7-9d88-a735ea82f3eb","Type":"ContainerDied","Data":"07eef51172fb6aa66ba86ed06cf514bf278791b64ac87a245e523adcf964098b"} Oct 06 07:10:33 crc kubenswrapper[4845]: I1006 07:10:33.028515 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-0f95-account-create-59gks"] Oct 06 07:10:33 crc kubenswrapper[4845]: I1006 07:10:33.045054 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-a34d-account-create-d27s2"] Oct 06 07:10:33 crc kubenswrapper[4845]: I1006 07:10:33.052886 4845 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0f95-account-create-59gks"] Oct 06 07:10:33 crc kubenswrapper[4845]: I1006 07:10:33.062734 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-a34d-account-create-d27s2"] Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.176038 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.237258 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59462ba4-763d-48f0-8d97-98113c788102" path="/var/lib/kubelet/pods/59462ba4-763d-48f0-8d97-98113c788102/volumes" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.237787 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce3f5dd8-4bb8-4695-bd9b-5363016555f7" path="/var/lib/kubelet/pods/ce3f5dd8-4bb8-4695-bd9b-5363016555f7/volumes" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.242920 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5fns\" (UniqueName: \"kubernetes.io/projected/e0ba0e05-c816-48c7-9d88-a735ea82f3eb-kube-api-access-z5fns\") pod \"e0ba0e05-c816-48c7-9d88-a735ea82f3eb\" (UID: \"e0ba0e05-c816-48c7-9d88-a735ea82f3eb\") " Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.243143 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0ba0e05-c816-48c7-9d88-a735ea82f3eb-ssh-key\") pod \"e0ba0e05-c816-48c7-9d88-a735ea82f3eb\" (UID: \"e0ba0e05-c816-48c7-9d88-a735ea82f3eb\") " Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.243256 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0ba0e05-c816-48c7-9d88-a735ea82f3eb-inventory\") pod \"e0ba0e05-c816-48c7-9d88-a735ea82f3eb\" (UID: 
\"e0ba0e05-c816-48c7-9d88-a735ea82f3eb\") " Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.265642 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ba0e05-c816-48c7-9d88-a735ea82f3eb-kube-api-access-z5fns" (OuterVolumeSpecName: "kube-api-access-z5fns") pod "e0ba0e05-c816-48c7-9d88-a735ea82f3eb" (UID: "e0ba0e05-c816-48c7-9d88-a735ea82f3eb"). InnerVolumeSpecName "kube-api-access-z5fns". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.275023 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ba0e05-c816-48c7-9d88-a735ea82f3eb-inventory" (OuterVolumeSpecName: "inventory") pod "e0ba0e05-c816-48c7-9d88-a735ea82f3eb" (UID: "e0ba0e05-c816-48c7-9d88-a735ea82f3eb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.290223 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ba0e05-c816-48c7-9d88-a735ea82f3eb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e0ba0e05-c816-48c7-9d88-a735ea82f3eb" (UID: "e0ba0e05-c816-48c7-9d88-a735ea82f3eb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.345118 4845 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0ba0e05-c816-48c7-9d88-a735ea82f3eb-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.345149 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5fns\" (UniqueName: \"kubernetes.io/projected/e0ba0e05-c816-48c7-9d88-a735ea82f3eb-kube-api-access-z5fns\") on node \"crc\" DevicePath \"\"" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.345160 4845 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0ba0e05-c816-48c7-9d88-a735ea82f3eb-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.772391 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.772359 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj" event={"ID":"e0ba0e05-c816-48c7-9d88-a735ea82f3eb","Type":"ContainerDied","Data":"570d042eff76a1655436c24b4a6d02401fdf50238c12231dd9255b6483351d9a"} Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.772513 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="570d042eff76a1655436c24b4a6d02401fdf50238c12231dd9255b6483351d9a" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.864429 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9"] Oct 06 07:10:34 crc kubenswrapper[4845]: E1006 07:10:34.864903 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3229aa3-4bcd-42d2-be10-3d8157e4d7f5" 
containerName="extract-utilities" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.864927 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3229aa3-4bcd-42d2-be10-3d8157e4d7f5" containerName="extract-utilities" Oct 06 07:10:34 crc kubenswrapper[4845]: E1006 07:10:34.864946 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3229aa3-4bcd-42d2-be10-3d8157e4d7f5" containerName="extract-content" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.864954 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3229aa3-4bcd-42d2-be10-3d8157e4d7f5" containerName="extract-content" Oct 06 07:10:34 crc kubenswrapper[4845]: E1006 07:10:34.864992 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3229aa3-4bcd-42d2-be10-3d8157e4d7f5" containerName="registry-server" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.865000 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3229aa3-4bcd-42d2-be10-3d8157e4d7f5" containerName="registry-server" Oct 06 07:10:34 crc kubenswrapper[4845]: E1006 07:10:34.865023 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ba0e05-c816-48c7-9d88-a735ea82f3eb" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.865032 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ba0e05-c816-48c7-9d88-a735ea82f3eb" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.865253 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ba0e05-c816-48c7-9d88-a735ea82f3eb" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.865278 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3229aa3-4bcd-42d2-be10-3d8157e4d7f5" containerName="registry-server" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.866123 4845 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.867616 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.868085 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p48vv" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.868156 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.869550 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.877181 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9"] Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.954117 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e569fb2-9612-43bd-93ab-bfad8fc42c9c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9\" (UID: \"4e569fb2-9612-43bd-93ab-bfad8fc42c9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9" Oct 06 07:10:34 crc kubenswrapper[4845]: I1006 07:10:34.954251 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e569fb2-9612-43bd-93ab-bfad8fc42c9c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9\" (UID: \"4e569fb2-9612-43bd-93ab-bfad8fc42c9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9" Oct 06 07:10:34 crc 
kubenswrapper[4845]: I1006 07:10:34.954325 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkhwd\" (UniqueName: \"kubernetes.io/projected/4e569fb2-9612-43bd-93ab-bfad8fc42c9c-kube-api-access-fkhwd\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9\" (UID: \"4e569fb2-9612-43bd-93ab-bfad8fc42c9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9" Oct 06 07:10:35 crc kubenswrapper[4845]: I1006 07:10:35.056104 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkhwd\" (UniqueName: \"kubernetes.io/projected/4e569fb2-9612-43bd-93ab-bfad8fc42c9c-kube-api-access-fkhwd\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9\" (UID: \"4e569fb2-9612-43bd-93ab-bfad8fc42c9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9" Oct 06 07:10:35 crc kubenswrapper[4845]: I1006 07:10:35.056179 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e569fb2-9612-43bd-93ab-bfad8fc42c9c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9\" (UID: \"4e569fb2-9612-43bd-93ab-bfad8fc42c9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9" Oct 06 07:10:35 crc kubenswrapper[4845]: I1006 07:10:35.056323 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e569fb2-9612-43bd-93ab-bfad8fc42c9c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9\" (UID: \"4e569fb2-9612-43bd-93ab-bfad8fc42c9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9" Oct 06 07:10:35 crc kubenswrapper[4845]: I1006 07:10:35.060165 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/4e569fb2-9612-43bd-93ab-bfad8fc42c9c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9\" (UID: \"4e569fb2-9612-43bd-93ab-bfad8fc42c9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9" Oct 06 07:10:35 crc kubenswrapper[4845]: I1006 07:10:35.060542 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e569fb2-9612-43bd-93ab-bfad8fc42c9c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9\" (UID: \"4e569fb2-9612-43bd-93ab-bfad8fc42c9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9" Oct 06 07:10:35 crc kubenswrapper[4845]: I1006 07:10:35.076278 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkhwd\" (UniqueName: \"kubernetes.io/projected/4e569fb2-9612-43bd-93ab-bfad8fc42c9c-kube-api-access-fkhwd\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9\" (UID: \"4e569fb2-9612-43bd-93ab-bfad8fc42c9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9" Oct 06 07:10:35 crc kubenswrapper[4845]: I1006 07:10:35.191566 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9" Oct 06 07:10:35 crc kubenswrapper[4845]: I1006 07:10:35.659419 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9"] Oct 06 07:10:35 crc kubenswrapper[4845]: I1006 07:10:35.670526 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 07:10:35 crc kubenswrapper[4845]: I1006 07:10:35.781234 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9" event={"ID":"4e569fb2-9612-43bd-93ab-bfad8fc42c9c","Type":"ContainerStarted","Data":"be73590acaa9c19879d29728ca319728cb4419331fe6f7fb03de5880d8235386"} Oct 06 07:10:36 crc kubenswrapper[4845]: I1006 07:10:36.790061 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9" event={"ID":"4e569fb2-9612-43bd-93ab-bfad8fc42c9c","Type":"ContainerStarted","Data":"7d7b00d1952bf0906015941de28096734dd70057e62a67473920f6c2451b5e30"} Oct 06 07:10:36 crc kubenswrapper[4845]: I1006 07:10:36.808729 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9" podStartSLOduration=1.984611881 podStartE2EDuration="2.808709095s" podCreationTimestamp="2025-10-06 07:10:34 +0000 UTC" firstStartedPulling="2025-10-06 07:10:35.670310322 +0000 UTC m=+1520.185051330" lastFinishedPulling="2025-10-06 07:10:36.494407536 +0000 UTC m=+1521.009148544" observedRunningTime="2025-10-06 07:10:36.802926857 +0000 UTC m=+1521.317667875" watchObservedRunningTime="2025-10-06 07:10:36.808709095 +0000 UTC m=+1521.323450093" Oct 06 07:10:38 crc kubenswrapper[4845]: I1006 07:10:38.227757 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 
07:10:38 crc kubenswrapper[4845]: E1006 07:10:38.228242 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:10:51 crc kubenswrapper[4845]: I1006 07:10:51.043652 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-vcfff"] Oct 06 07:10:51 crc kubenswrapper[4845]: I1006 07:10:51.051016 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-vwhfh"] Oct 06 07:10:51 crc kubenswrapper[4845]: I1006 07:10:51.060139 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-bzr5z"] Oct 06 07:10:51 crc kubenswrapper[4845]: I1006 07:10:51.066829 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-bzr5z"] Oct 06 07:10:51 crc kubenswrapper[4845]: I1006 07:10:51.073113 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-vwhfh"] Oct 06 07:10:51 crc kubenswrapper[4845]: I1006 07:10:51.079145 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-vcfff"] Oct 06 07:10:52 crc kubenswrapper[4845]: I1006 07:10:52.226550 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:10:52 crc kubenswrapper[4845]: E1006 07:10:52.227088 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:10:52 crc kubenswrapper[4845]: I1006 07:10:52.238115 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d918037-6a73-4a99-9e69-28495f63f3be" path="/var/lib/kubelet/pods/6d918037-6a73-4a99-9e69-28495f63f3be/volumes" Oct 06 07:10:52 crc kubenswrapper[4845]: I1006 07:10:52.238662 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a384bf06-7d5c-4189-88ca-67302613c968" path="/var/lib/kubelet/pods/a384bf06-7d5c-4189-88ca-67302613c968/volumes" Oct 06 07:10:52 crc kubenswrapper[4845]: I1006 07:10:52.239171 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c59096b3-228f-421b-9e87-ad0dd90c1ab3" path="/var/lib/kubelet/pods/c59096b3-228f-421b-9e87-ad0dd90c1ab3/volumes" Oct 06 07:10:57 crc kubenswrapper[4845]: I1006 07:10:57.029492 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-qnzpb"] Oct 06 07:10:57 crc kubenswrapper[4845]: I1006 07:10:57.056194 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-qnzpb"] Oct 06 07:10:58 crc kubenswrapper[4845]: I1006 07:10:58.239687 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef9aaef1-2966-4818-9f26-87bcf275907d" path="/var/lib/kubelet/pods/ef9aaef1-2966-4818-9f26-87bcf275907d/volumes" Oct 06 07:11:00 crc kubenswrapper[4845]: I1006 07:11:00.039110 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-da3d-account-create-cw7vt"] Oct 06 07:11:00 crc kubenswrapper[4845]: I1006 07:11:00.081039 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-da3d-account-create-cw7vt"] Oct 06 07:11:00 crc kubenswrapper[4845]: I1006 07:11:00.237386 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0b56390-32a4-4314-b26f-9e818676dde8" 
path="/var/lib/kubelet/pods/c0b56390-32a4-4314-b26f-9e818676dde8/volumes" Oct 06 07:11:01 crc kubenswrapper[4845]: I1006 07:11:01.025126 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-jnr5f"] Oct 06 07:11:01 crc kubenswrapper[4845]: I1006 07:11:01.033564 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-jnr5f"] Oct 06 07:11:02 crc kubenswrapper[4845]: I1006 07:11:02.244095 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29c86b89-d988-4e25-8019-77e7e266f785" path="/var/lib/kubelet/pods/29c86b89-d988-4e25-8019-77e7e266f785/volumes" Oct 06 07:11:05 crc kubenswrapper[4845]: I1006 07:11:05.227518 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:11:05 crc kubenswrapper[4845]: E1006 07:11:05.228610 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:11:16 crc kubenswrapper[4845]: I1006 07:11:16.040235 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a311-account-create-nqw7t"] Oct 06 07:11:16 crc kubenswrapper[4845]: I1006 07:11:16.051057 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-ec2e-account-create-sxrxb"] Oct 06 07:11:16 crc kubenswrapper[4845]: I1006 07:11:16.058801 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a311-account-create-nqw7t"] Oct 06 07:11:16 crc kubenswrapper[4845]: I1006 07:11:16.065225 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-ec2e-account-create-sxrxb"] Oct 06 
07:11:16 crc kubenswrapper[4845]: I1006 07:11:16.235136 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:11:16 crc kubenswrapper[4845]: E1006 07:11:16.235854 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:11:16 crc kubenswrapper[4845]: I1006 07:11:16.241458 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65f7cd79-f2bf-4927-9967-a6f5ccd42bdf" path="/var/lib/kubelet/pods/65f7cd79-f2bf-4927-9967-a6f5ccd42bdf/volumes" Oct 06 07:11:16 crc kubenswrapper[4845]: I1006 07:11:16.242127 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27" path="/var/lib/kubelet/pods/7594b2d3-25d9-41e7-8b9d-d3c7ad1c3d27/volumes" Oct 06 07:11:18 crc kubenswrapper[4845]: I1006 07:11:18.034593 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-rdfkv"] Oct 06 07:11:18 crc kubenswrapper[4845]: I1006 07:11:18.042576 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-rdfkv"] Oct 06 07:11:18 crc kubenswrapper[4845]: I1006 07:11:18.246206 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe" path="/var/lib/kubelet/pods/9f5c42ed-45a5-40fe-8e8e-aaa105c2ffbe/volumes" Oct 06 07:11:22 crc kubenswrapper[4845]: I1006 07:11:22.028539 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rd54p"] Oct 06 07:11:22 crc kubenswrapper[4845]: I1006 07:11:22.036725 4845 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/keystone-bootstrap-rd54p"] Oct 06 07:11:22 crc kubenswrapper[4845]: I1006 07:11:22.259574 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b91871bf-75cd-40c0-aa78-a4fd53ff54dc" path="/var/lib/kubelet/pods/b91871bf-75cd-40c0-aa78-a4fd53ff54dc/volumes" Oct 06 07:11:22 crc kubenswrapper[4845]: I1006 07:11:22.811771 4845 scope.go:117] "RemoveContainer" containerID="c29c806e8664e06bfdf7e599cb473d52132b1a59dca59532453e6145130f7d3c" Oct 06 07:11:22 crc kubenswrapper[4845]: I1006 07:11:22.835575 4845 scope.go:117] "RemoveContainer" containerID="de3986ecf7ebfda08bc8723317f9b9ac87dd96278e9df91892d3b3294d4c3029" Oct 06 07:11:22 crc kubenswrapper[4845]: I1006 07:11:22.876566 4845 scope.go:117] "RemoveContainer" containerID="0ef35a947d4bfaa4d135fab464751529dde2deaf8189b41ca5b6542a7b750237" Oct 06 07:11:22 crc kubenswrapper[4845]: I1006 07:11:22.927386 4845 scope.go:117] "RemoveContainer" containerID="08b5e9c4a09c775e386026fc80e8f24023c6f116024871d0cd0e5491281761b8" Oct 06 07:11:22 crc kubenswrapper[4845]: I1006 07:11:22.966009 4845 scope.go:117] "RemoveContainer" containerID="44933b23a87e922443e622c8a72c3d3b56c3ac3b128ee3ae85a8ae43c1106d36" Oct 06 07:11:23 crc kubenswrapper[4845]: I1006 07:11:23.011746 4845 scope.go:117] "RemoveContainer" containerID="6e1ba7ad92ce6aae9c596c3573f4a6baf356f82601c253c7f0a387812a2288ac" Oct 06 07:11:23 crc kubenswrapper[4845]: I1006 07:11:23.056841 4845 scope.go:117] "RemoveContainer" containerID="7786d1de1fc95aec0e85df20eb485eeb8252c9e3a1ae49ab99de61886a7c60b3" Oct 06 07:11:23 crc kubenswrapper[4845]: I1006 07:11:23.078523 4845 scope.go:117] "RemoveContainer" containerID="8c649eeb37b3414df39222dc7f3fe5c997214b53d0bca6d3794ab775e2eacabb" Oct 06 07:11:23 crc kubenswrapper[4845]: I1006 07:11:23.110429 4845 scope.go:117] "RemoveContainer" containerID="e94d54819e226679334dff47a83802012e4c5af1b9fc34261ff87281f68a047a" Oct 06 07:11:23 crc kubenswrapper[4845]: I1006 07:11:23.136575 4845 scope.go:117] 
"RemoveContainer" containerID="7b5f220d689497528f0edec5f867002f6f0feb8853739cf1a8f7f6508ee98200" Oct 06 07:11:23 crc kubenswrapper[4845]: I1006 07:11:23.157610 4845 scope.go:117] "RemoveContainer" containerID="2b0025c15fbc51b486fec70b1550e4d0228b888630bc10223ef1a029dd162cc3" Oct 06 07:11:23 crc kubenswrapper[4845]: I1006 07:11:23.179970 4845 scope.go:117] "RemoveContainer" containerID="d6aa494a7c2fda78738b6448afae169d3ca3305dcbc62f44c57a3fbab10089b4" Oct 06 07:11:23 crc kubenswrapper[4845]: I1006 07:11:23.209472 4845 scope.go:117] "RemoveContainer" containerID="2ff3bb2409dda5652b90cbec61798c6873cb910c22f2a41bc90e55601c3df11d" Oct 06 07:11:23 crc kubenswrapper[4845]: I1006 07:11:23.232169 4845 scope.go:117] "RemoveContainer" containerID="3d93a939cd1a83b322fefc8f7c6cb78e918d284d9f6c1176b1f4d748d802e7d6" Oct 06 07:11:23 crc kubenswrapper[4845]: I1006 07:11:23.268906 4845 scope.go:117] "RemoveContainer" containerID="929e16a909839f0532543c2b025fe4a732387ac9475baeb28d68bf0d835ca67f" Oct 06 07:11:31 crc kubenswrapper[4845]: I1006 07:11:31.227608 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:11:31 crc kubenswrapper[4845]: E1006 07:11:31.229747 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:11:33 crc kubenswrapper[4845]: I1006 07:11:33.030442 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-rpmdt"] Oct 06 07:11:33 crc kubenswrapper[4845]: I1006 07:11:33.039053 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-rpmdt"] Oct 06 07:11:34 crc 
kubenswrapper[4845]: I1006 07:11:34.242952 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00ff2f10-a7a1-458b-8a67-78879221e169" path="/var/lib/kubelet/pods/00ff2f10-a7a1-458b-8a67-78879221e169/volumes" Oct 06 07:11:42 crc kubenswrapper[4845]: I1006 07:11:42.035806 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-wpzxr"] Oct 06 07:11:42 crc kubenswrapper[4845]: I1006 07:11:42.044009 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-wpzxr"] Oct 06 07:11:42 crc kubenswrapper[4845]: I1006 07:11:42.236969 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eecc14ae-0a55-4b21-9ecf-07b6c122e050" path="/var/lib/kubelet/pods/eecc14ae-0a55-4b21-9ecf-07b6c122e050/volumes" Oct 06 07:11:44 crc kubenswrapper[4845]: I1006 07:11:44.226949 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:11:44 crc kubenswrapper[4845]: E1006 07:11:44.228455 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:11:45 crc kubenswrapper[4845]: I1006 07:11:45.409045 4845 generic.go:334] "Generic (PLEG): container finished" podID="4e569fb2-9612-43bd-93ab-bfad8fc42c9c" containerID="7d7b00d1952bf0906015941de28096734dd70057e62a67473920f6c2451b5e30" exitCode=0 Oct 06 07:11:45 crc kubenswrapper[4845]: I1006 07:11:45.409135 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9" 
event={"ID":"4e569fb2-9612-43bd-93ab-bfad8fc42c9c","Type":"ContainerDied","Data":"7d7b00d1952bf0906015941de28096734dd70057e62a67473920f6c2451b5e30"} Oct 06 07:11:46 crc kubenswrapper[4845]: I1006 07:11:46.785938 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9" Oct 06 07:11:46 crc kubenswrapper[4845]: I1006 07:11:46.902884 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e569fb2-9612-43bd-93ab-bfad8fc42c9c-ssh-key\") pod \"4e569fb2-9612-43bd-93ab-bfad8fc42c9c\" (UID: \"4e569fb2-9612-43bd-93ab-bfad8fc42c9c\") " Oct 06 07:11:46 crc kubenswrapper[4845]: I1006 07:11:46.903052 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkhwd\" (UniqueName: \"kubernetes.io/projected/4e569fb2-9612-43bd-93ab-bfad8fc42c9c-kube-api-access-fkhwd\") pod \"4e569fb2-9612-43bd-93ab-bfad8fc42c9c\" (UID: \"4e569fb2-9612-43bd-93ab-bfad8fc42c9c\") " Oct 06 07:11:46 crc kubenswrapper[4845]: I1006 07:11:46.903080 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e569fb2-9612-43bd-93ab-bfad8fc42c9c-inventory\") pod \"4e569fb2-9612-43bd-93ab-bfad8fc42c9c\" (UID: \"4e569fb2-9612-43bd-93ab-bfad8fc42c9c\") " Oct 06 07:11:46 crc kubenswrapper[4845]: I1006 07:11:46.914647 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e569fb2-9612-43bd-93ab-bfad8fc42c9c-kube-api-access-fkhwd" (OuterVolumeSpecName: "kube-api-access-fkhwd") pod "4e569fb2-9612-43bd-93ab-bfad8fc42c9c" (UID: "4e569fb2-9612-43bd-93ab-bfad8fc42c9c"). InnerVolumeSpecName "kube-api-access-fkhwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:11:46 crc kubenswrapper[4845]: I1006 07:11:46.927695 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e569fb2-9612-43bd-93ab-bfad8fc42c9c-inventory" (OuterVolumeSpecName: "inventory") pod "4e569fb2-9612-43bd-93ab-bfad8fc42c9c" (UID: "4e569fb2-9612-43bd-93ab-bfad8fc42c9c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:11:46 crc kubenswrapper[4845]: I1006 07:11:46.932033 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e569fb2-9612-43bd-93ab-bfad8fc42c9c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4e569fb2-9612-43bd-93ab-bfad8fc42c9c" (UID: "4e569fb2-9612-43bd-93ab-bfad8fc42c9c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.005726 4845 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e569fb2-9612-43bd-93ab-bfad8fc42c9c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.005758 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkhwd\" (UniqueName: \"kubernetes.io/projected/4e569fb2-9612-43bd-93ab-bfad8fc42c9c-kube-api-access-fkhwd\") on node \"crc\" DevicePath \"\"" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.005773 4845 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e569fb2-9612-43bd-93ab-bfad8fc42c9c-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.426938 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9" 
event={"ID":"4e569fb2-9612-43bd-93ab-bfad8fc42c9c","Type":"ContainerDied","Data":"be73590acaa9c19879d29728ca319728cb4419331fe6f7fb03de5880d8235386"} Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.427257 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be73590acaa9c19879d29728ca319728cb4419331fe6f7fb03de5880d8235386" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.427017 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.498004 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk"] Oct 06 07:11:47 crc kubenswrapper[4845]: E1006 07:11:47.498386 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e569fb2-9612-43bd-93ab-bfad8fc42c9c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.498403 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e569fb2-9612-43bd-93ab-bfad8fc42c9c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.498594 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e569fb2-9612-43bd-93ab-bfad8fc42c9c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.499187 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.501552 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.501630 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.502539 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.503102 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p48vv" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.515654 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2s7r\" (UniqueName: \"kubernetes.io/projected/f9e93bbd-d62e-451f-bea9-c6a926c912a6-kube-api-access-x2s7r\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qchpk\" (UID: \"f9e93bbd-d62e-451f-bea9-c6a926c912a6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.515732 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f9e93bbd-d62e-451f-bea9-c6a926c912a6-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qchpk\" (UID: \"f9e93bbd-d62e-451f-bea9-c6a926c912a6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.515812 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/f9e93bbd-d62e-451f-bea9-c6a926c912a6-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qchpk\" (UID: \"f9e93bbd-d62e-451f-bea9-c6a926c912a6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.520823 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk"] Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.617470 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9e93bbd-d62e-451f-bea9-c6a926c912a6-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qchpk\" (UID: \"f9e93bbd-d62e-451f-bea9-c6a926c912a6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.617616 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2s7r\" (UniqueName: \"kubernetes.io/projected/f9e93bbd-d62e-451f-bea9-c6a926c912a6-kube-api-access-x2s7r\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qchpk\" (UID: \"f9e93bbd-d62e-451f-bea9-c6a926c912a6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.617651 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f9e93bbd-d62e-451f-bea9-c6a926c912a6-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qchpk\" (UID: \"f9e93bbd-d62e-451f-bea9-c6a926c912a6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.621171 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/f9e93bbd-d62e-451f-bea9-c6a926c912a6-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qchpk\" (UID: \"f9e93bbd-d62e-451f-bea9-c6a926c912a6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.621205 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9e93bbd-d62e-451f-bea9-c6a926c912a6-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qchpk\" (UID: \"f9e93bbd-d62e-451f-bea9-c6a926c912a6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.639645 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2s7r\" (UniqueName: \"kubernetes.io/projected/f9e93bbd-d62e-451f-bea9-c6a926c912a6-kube-api-access-x2s7r\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-qchpk\" (UID: \"f9e93bbd-d62e-451f-bea9-c6a926c912a6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk" Oct 06 07:11:47 crc kubenswrapper[4845]: I1006 07:11:47.818739 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk" Oct 06 07:11:48 crc kubenswrapper[4845]: I1006 07:11:48.324564 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk"] Oct 06 07:11:48 crc kubenswrapper[4845]: I1006 07:11:48.438203 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk" event={"ID":"f9e93bbd-d62e-451f-bea9-c6a926c912a6","Type":"ContainerStarted","Data":"f14365731906c9d7387a73584b8ac13ea02c066075a3cc1e1c7d5d9705bd3d4d"} Oct 06 07:11:49 crc kubenswrapper[4845]: I1006 07:11:49.448853 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk" event={"ID":"f9e93bbd-d62e-451f-bea9-c6a926c912a6","Type":"ContainerStarted","Data":"b761e44df399345a891a3accc14c66f5ee89cf6feb4ac99bfb6cfe17333d2f7e"} Oct 06 07:11:49 crc kubenswrapper[4845]: I1006 07:11:49.466061 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk" podStartSLOduration=1.872545811 podStartE2EDuration="2.466040853s" podCreationTimestamp="2025-10-06 07:11:47 +0000 UTC" firstStartedPulling="2025-10-06 07:11:48.333673103 +0000 UTC m=+1592.848414111" lastFinishedPulling="2025-10-06 07:11:48.927168125 +0000 UTC m=+1593.441909153" observedRunningTime="2025-10-06 07:11:49.464490344 +0000 UTC m=+1593.979231372" watchObservedRunningTime="2025-10-06 07:11:49.466040853 +0000 UTC m=+1593.980781861" Oct 06 07:11:54 crc kubenswrapper[4845]: I1006 07:11:54.492414 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk" event={"ID":"f9e93bbd-d62e-451f-bea9-c6a926c912a6","Type":"ContainerDied","Data":"b761e44df399345a891a3accc14c66f5ee89cf6feb4ac99bfb6cfe17333d2f7e"} Oct 06 07:11:54 
crc kubenswrapper[4845]: I1006 07:11:54.493447 4845 generic.go:334] "Generic (PLEG): container finished" podID="f9e93bbd-d62e-451f-bea9-c6a926c912a6" containerID="b761e44df399345a891a3accc14c66f5ee89cf6feb4ac99bfb6cfe17333d2f7e" exitCode=0 Oct 06 07:11:55 crc kubenswrapper[4845]: I1006 07:11:55.226649 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:11:55 crc kubenswrapper[4845]: E1006 07:11:55.226991 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:11:55 crc kubenswrapper[4845]: I1006 07:11:55.931446 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.116124 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2s7r\" (UniqueName: \"kubernetes.io/projected/f9e93bbd-d62e-451f-bea9-c6a926c912a6-kube-api-access-x2s7r\") pod \"f9e93bbd-d62e-451f-bea9-c6a926c912a6\" (UID: \"f9e93bbd-d62e-451f-bea9-c6a926c912a6\") " Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.116501 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f9e93bbd-d62e-451f-bea9-c6a926c912a6-ssh-key\") pod \"f9e93bbd-d62e-451f-bea9-c6a926c912a6\" (UID: \"f9e93bbd-d62e-451f-bea9-c6a926c912a6\") " Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.116609 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9e93bbd-d62e-451f-bea9-c6a926c912a6-inventory\") pod \"f9e93bbd-d62e-451f-bea9-c6a926c912a6\" (UID: \"f9e93bbd-d62e-451f-bea9-c6a926c912a6\") " Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.121330 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e93bbd-d62e-451f-bea9-c6a926c912a6-kube-api-access-x2s7r" (OuterVolumeSpecName: "kube-api-access-x2s7r") pod "f9e93bbd-d62e-451f-bea9-c6a926c912a6" (UID: "f9e93bbd-d62e-451f-bea9-c6a926c912a6"). InnerVolumeSpecName "kube-api-access-x2s7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:11:56 crc kubenswrapper[4845]: E1006 07:11:56.144906 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9e93bbd-d62e-451f-bea9-c6a926c912a6-inventory podName:f9e93bbd-d62e-451f-bea9-c6a926c912a6 nodeName:}" failed. No retries permitted until 2025-10-06 07:11:56.644875757 +0000 UTC m=+1601.159616765 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/f9e93bbd-d62e-451f-bea9-c6a926c912a6-inventory") pod "f9e93bbd-d62e-451f-bea9-c6a926c912a6" (UID: "f9e93bbd-d62e-451f-bea9-c6a926c912a6") : error deleting /var/lib/kubelet/pods/f9e93bbd-d62e-451f-bea9-c6a926c912a6/volume-subpaths: remove /var/lib/kubelet/pods/f9e93bbd-d62e-451f-bea9-c6a926c912a6/volume-subpaths: no such file or directory Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.148531 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e93bbd-d62e-451f-bea9-c6a926c912a6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f9e93bbd-d62e-451f-bea9-c6a926c912a6" (UID: "f9e93bbd-d62e-451f-bea9-c6a926c912a6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.218733 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2s7r\" (UniqueName: \"kubernetes.io/projected/f9e93bbd-d62e-451f-bea9-c6a926c912a6-kube-api-access-x2s7r\") on node \"crc\" DevicePath \"\"" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.218765 4845 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f9e93bbd-d62e-451f-bea9-c6a926c912a6-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.511821 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk" event={"ID":"f9e93bbd-d62e-451f-bea9-c6a926c912a6","Type":"ContainerDied","Data":"f14365731906c9d7387a73584b8ac13ea02c066075a3cc1e1c7d5d9705bd3d4d"} Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.512092 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f14365731906c9d7387a73584b8ac13ea02c066075a3cc1e1c7d5d9705bd3d4d" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 
07:11:56.511917 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-qchpk" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.580302 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv"] Oct 06 07:11:56 crc kubenswrapper[4845]: E1006 07:11:56.580774 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e93bbd-d62e-451f-bea9-c6a926c912a6" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.580796 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e93bbd-d62e-451f-bea9-c6a926c912a6" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.581030 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e93bbd-d62e-451f-bea9-c6a926c912a6" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.581693 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.626946 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9079881-df95-4fe0-a6db-2f085d6d974e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hb7dv\" (UID: \"a9079881-df95-4fe0-a6db-2f085d6d974e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.627022 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9079881-df95-4fe0-a6db-2f085d6d974e-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hb7dv\" (UID: \"a9079881-df95-4fe0-a6db-2f085d6d974e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.627085 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hwvz\" (UniqueName: \"kubernetes.io/projected/a9079881-df95-4fe0-a6db-2f085d6d974e-kube-api-access-2hwvz\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hb7dv\" (UID: \"a9079881-df95-4fe0-a6db-2f085d6d974e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.632543 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv"] Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.728464 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9e93bbd-d62e-451f-bea9-c6a926c912a6-inventory\") pod \"f9e93bbd-d62e-451f-bea9-c6a926c912a6\" (UID: \"f9e93bbd-d62e-451f-bea9-c6a926c912a6\") " Oct 06 07:11:56 crc 
kubenswrapper[4845]: I1006 07:11:56.728766 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9079881-df95-4fe0-a6db-2f085d6d974e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hb7dv\" (UID: \"a9079881-df95-4fe0-a6db-2f085d6d974e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.728833 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9079881-df95-4fe0-a6db-2f085d6d974e-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hb7dv\" (UID: \"a9079881-df95-4fe0-a6db-2f085d6d974e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.728901 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hwvz\" (UniqueName: \"kubernetes.io/projected/a9079881-df95-4fe0-a6db-2f085d6d974e-kube-api-access-2hwvz\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hb7dv\" (UID: \"a9079881-df95-4fe0-a6db-2f085d6d974e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.732395 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9079881-df95-4fe0-a6db-2f085d6d974e-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hb7dv\" (UID: \"a9079881-df95-4fe0-a6db-2f085d6d974e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.732732 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9079881-df95-4fe0-a6db-2f085d6d974e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hb7dv\" (UID: 
\"a9079881-df95-4fe0-a6db-2f085d6d974e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.740651 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e93bbd-d62e-451f-bea9-c6a926c912a6-inventory" (OuterVolumeSpecName: "inventory") pod "f9e93bbd-d62e-451f-bea9-c6a926c912a6" (UID: "f9e93bbd-d62e-451f-bea9-c6a926c912a6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.743932 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hwvz\" (UniqueName: \"kubernetes.io/projected/a9079881-df95-4fe0-a6db-2f085d6d974e-kube-api-access-2hwvz\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hb7dv\" (UID: \"a9079881-df95-4fe0-a6db-2f085d6d974e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.829978 4845 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9e93bbd-d62e-451f-bea9-c6a926c912a6-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 07:11:56 crc kubenswrapper[4845]: I1006 07:11:56.900580 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv" Oct 06 07:11:57 crc kubenswrapper[4845]: I1006 07:11:57.368491 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv"] Oct 06 07:11:57 crc kubenswrapper[4845]: I1006 07:11:57.521166 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv" event={"ID":"a9079881-df95-4fe0-a6db-2f085d6d974e","Type":"ContainerStarted","Data":"35553b0def466f49ad9c1f2dff40c952a082e62ecd788725d4845c9e57e237d3"} Oct 06 07:11:58 crc kubenswrapper[4845]: I1006 07:11:58.532913 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv" event={"ID":"a9079881-df95-4fe0-a6db-2f085d6d974e","Type":"ContainerStarted","Data":"22cfd8c086ae76f5ed641767ca298d8a6c4486716aa3dc488f83f0efb606f625"} Oct 06 07:12:09 crc kubenswrapper[4845]: I1006 07:12:09.036425 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv" podStartSLOduration=12.583523528 podStartE2EDuration="13.036401402s" podCreationTimestamp="2025-10-06 07:11:56 +0000 UTC" firstStartedPulling="2025-10-06 07:11:57.377036463 +0000 UTC m=+1601.891777471" lastFinishedPulling="2025-10-06 07:11:57.829914337 +0000 UTC m=+1602.344655345" observedRunningTime="2025-10-06 07:11:58.547889464 +0000 UTC m=+1603.062630472" watchObservedRunningTime="2025-10-06 07:12:09.036401402 +0000 UTC m=+1613.551142420" Oct 06 07:12:09 crc kubenswrapper[4845]: I1006 07:12:09.038515 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-4wcpl"] Oct 06 07:12:09 crc kubenswrapper[4845]: I1006 07:12:09.049486 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-4wcpl"] Oct 06 07:12:10 crc kubenswrapper[4845]: I1006 07:12:10.226482 4845 
scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:12:10 crc kubenswrapper[4845]: E1006 07:12:10.227049 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:12:10 crc kubenswrapper[4845]: I1006 07:12:10.237828 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c8a9f74-3441-4e68-8014-08472abf6680" path="/var/lib/kubelet/pods/8c8a9f74-3441-4e68-8014-08472abf6680/volumes" Oct 06 07:12:12 crc kubenswrapper[4845]: I1006 07:12:12.028365 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-zdqjb"] Oct 06 07:12:12 crc kubenswrapper[4845]: I1006 07:12:12.036793 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-2ct4d"] Oct 06 07:12:12 crc kubenswrapper[4845]: I1006 07:12:12.044546 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-2ct4d"] Oct 06 07:12:12 crc kubenswrapper[4845]: I1006 07:12:12.052047 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-zdqjb"] Oct 06 07:12:12 crc kubenswrapper[4845]: I1006 07:12:12.253982 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="168d541f-1e89-45b6-9817-91a8242a44fd" path="/var/lib/kubelet/pods/168d541f-1e89-45b6-9817-91a8242a44fd/volumes" Oct 06 07:12:12 crc kubenswrapper[4845]: I1006 07:12:12.255058 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd" 
path="/var/lib/kubelet/pods/4e0dd3f0-a35a-4c8b-8f80-a33c46a346dd/volumes" Oct 06 07:12:13 crc kubenswrapper[4845]: I1006 07:12:13.031302 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-4g7bh"] Oct 06 07:12:13 crc kubenswrapper[4845]: I1006 07:12:13.038293 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-4g7bh"] Oct 06 07:12:14 crc kubenswrapper[4845]: I1006 07:12:14.246709 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4713c22-fab5-490b-b167-abae4ecfca11" path="/var/lib/kubelet/pods/d4713c22-fab5-490b-b167-abae4ecfca11/volumes" Oct 06 07:12:17 crc kubenswrapper[4845]: I1006 07:12:17.023303 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-c7b0-account-create-ztfjf"] Oct 06 07:12:17 crc kubenswrapper[4845]: I1006 07:12:17.029667 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-c7b0-account-create-ztfjf"] Oct 06 07:12:18 crc kubenswrapper[4845]: I1006 07:12:18.239927 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3985a14b-5690-43fb-98ed-b6bd2fe9ea1a" path="/var/lib/kubelet/pods/3985a14b-5690-43fb-98ed-b6bd2fe9ea1a/volumes" Oct 06 07:12:23 crc kubenswrapper[4845]: I1006 07:12:23.524926 4845 scope.go:117] "RemoveContainer" containerID="a7f94e7944128eae6ea27e0cf374469d4e8d970eee1656f60998e0892c4c6f08" Oct 06 07:12:23 crc kubenswrapper[4845]: I1006 07:12:23.551004 4845 scope.go:117] "RemoveContainer" containerID="58caf3b6149fbdb8593cc52903e01819229bfcd094c10597ff05d55cb3e14416" Oct 06 07:12:23 crc kubenswrapper[4845]: I1006 07:12:23.617789 4845 scope.go:117] "RemoveContainer" containerID="f2b1ea5962862cea48cba4b0117a5621b6fc92cd223ad7186549c5d5a2d42030" Oct 06 07:12:23 crc kubenswrapper[4845]: I1006 07:12:23.677834 4845 scope.go:117] "RemoveContainer" containerID="2765602a1e6060186abbd0599bc9af71cd1731ce86d275d0a87f6bdf40477ca8" Oct 06 07:12:23 crc kubenswrapper[4845]: 
I1006 07:12:23.712733 4845 scope.go:117] "RemoveContainer" containerID="5bf0d22cbdc527c5b53ef8b0f961510fada037f062a19e2ecee78bf35ab0004e" Oct 06 07:12:23 crc kubenswrapper[4845]: I1006 07:12:23.745988 4845 scope.go:117] "RemoveContainer" containerID="8727d7d487d095e6cab6601d2e153a1c869bbc6a443e26047b1e7773ff9f300f" Oct 06 07:12:23 crc kubenswrapper[4845]: I1006 07:12:23.833657 4845 scope.go:117] "RemoveContainer" containerID="f933d86f47f22e4a63f9e72dcd42428743559269c4e8cbd4ca7be74171922570" Oct 06 07:12:24 crc kubenswrapper[4845]: I1006 07:12:24.226472 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:12:24 crc kubenswrapper[4845]: E1006 07:12:24.226885 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:12:26 crc kubenswrapper[4845]: I1006 07:12:26.025645 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-88b1-account-create-f9fm7"] Oct 06 07:12:26 crc kubenswrapper[4845]: I1006 07:12:26.033349 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-88b1-account-create-f9fm7"] Oct 06 07:12:26 crc kubenswrapper[4845]: I1006 07:12:26.235281 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f02af01-d26c-400e-9518-a458cb5db6b9" path="/var/lib/kubelet/pods/4f02af01-d26c-400e-9518-a458cb5db6b9/volumes" Oct 06 07:12:27 crc kubenswrapper[4845]: I1006 07:12:27.029701 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4400-account-create-f658t"] Oct 06 07:12:27 crc kubenswrapper[4845]: I1006 07:12:27.038537 4845 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4400-account-create-f658t"] Oct 06 07:12:28 crc kubenswrapper[4845]: I1006 07:12:28.240055 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94646571-314c-47a7-a651-e919dc640636" path="/var/lib/kubelet/pods/94646571-314c-47a7-a651-e919dc640636/volumes" Oct 06 07:12:33 crc kubenswrapper[4845]: I1006 07:12:33.907244 4845 generic.go:334] "Generic (PLEG): container finished" podID="a9079881-df95-4fe0-a6db-2f085d6d974e" containerID="22cfd8c086ae76f5ed641767ca298d8a6c4486716aa3dc488f83f0efb606f625" exitCode=0 Oct 06 07:12:33 crc kubenswrapper[4845]: I1006 07:12:33.907356 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv" event={"ID":"a9079881-df95-4fe0-a6db-2f085d6d974e","Type":"ContainerDied","Data":"22cfd8c086ae76f5ed641767ca298d8a6c4486716aa3dc488f83f0efb606f625"} Oct 06 07:12:35 crc kubenswrapper[4845]: I1006 07:12:35.226762 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:12:35 crc kubenswrapper[4845]: E1006 07:12:35.227299 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:12:35 crc kubenswrapper[4845]: I1006 07:12:35.424791 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv" Oct 06 07:12:35 crc kubenswrapper[4845]: I1006 07:12:35.483694 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9079881-df95-4fe0-a6db-2f085d6d974e-inventory\") pod \"a9079881-df95-4fe0-a6db-2f085d6d974e\" (UID: \"a9079881-df95-4fe0-a6db-2f085d6d974e\") " Oct 06 07:12:35 crc kubenswrapper[4845]: I1006 07:12:35.483791 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hwvz\" (UniqueName: \"kubernetes.io/projected/a9079881-df95-4fe0-a6db-2f085d6d974e-kube-api-access-2hwvz\") pod \"a9079881-df95-4fe0-a6db-2f085d6d974e\" (UID: \"a9079881-df95-4fe0-a6db-2f085d6d974e\") " Oct 06 07:12:35 crc kubenswrapper[4845]: I1006 07:12:35.484019 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9079881-df95-4fe0-a6db-2f085d6d974e-ssh-key\") pod \"a9079881-df95-4fe0-a6db-2f085d6d974e\" (UID: \"a9079881-df95-4fe0-a6db-2f085d6d974e\") " Oct 06 07:12:35 crc kubenswrapper[4845]: I1006 07:12:35.493638 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9079881-df95-4fe0-a6db-2f085d6d974e-kube-api-access-2hwvz" (OuterVolumeSpecName: "kube-api-access-2hwvz") pod "a9079881-df95-4fe0-a6db-2f085d6d974e" (UID: "a9079881-df95-4fe0-a6db-2f085d6d974e"). InnerVolumeSpecName "kube-api-access-2hwvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:12:35 crc kubenswrapper[4845]: I1006 07:12:35.513886 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9079881-df95-4fe0-a6db-2f085d6d974e-inventory" (OuterVolumeSpecName: "inventory") pod "a9079881-df95-4fe0-a6db-2f085d6d974e" (UID: "a9079881-df95-4fe0-a6db-2f085d6d974e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:12:35 crc kubenswrapper[4845]: I1006 07:12:35.517680 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9079881-df95-4fe0-a6db-2f085d6d974e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a9079881-df95-4fe0-a6db-2f085d6d974e" (UID: "a9079881-df95-4fe0-a6db-2f085d6d974e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:12:35 crc kubenswrapper[4845]: I1006 07:12:35.604342 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hwvz\" (UniqueName: \"kubernetes.io/projected/a9079881-df95-4fe0-a6db-2f085d6d974e-kube-api-access-2hwvz\") on node \"crc\" DevicePath \"\"" Oct 06 07:12:35 crc kubenswrapper[4845]: I1006 07:12:35.604829 4845 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9079881-df95-4fe0-a6db-2f085d6d974e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 07:12:35 crc kubenswrapper[4845]: I1006 07:12:35.604840 4845 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9079881-df95-4fe0-a6db-2f085d6d974e-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 07:12:35 crc kubenswrapper[4845]: I1006 07:12:35.930512 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv" event={"ID":"a9079881-df95-4fe0-a6db-2f085d6d974e","Type":"ContainerDied","Data":"35553b0def466f49ad9c1f2dff40c952a082e62ecd788725d4845c9e57e237d3"} Oct 06 07:12:35 crc kubenswrapper[4845]: I1006 07:12:35.930853 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35553b0def466f49ad9c1f2dff40c952a082e62ecd788725d4845c9e57e237d3" Oct 06 07:12:35 crc kubenswrapper[4845]: I1006 07:12:35.930791 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hb7dv" Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.033577 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c"] Oct 06 07:12:36 crc kubenswrapper[4845]: E1006 07:12:36.034092 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9079881-df95-4fe0-a6db-2f085d6d974e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.034114 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9079881-df95-4fe0-a6db-2f085d6d974e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.034354 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9079881-df95-4fe0-a6db-2f085d6d974e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.035145 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c" Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.038247 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.038293 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.038321 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.038461 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p48vv" Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.062127 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c"] Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.113359 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7h8j\" (UniqueName: \"kubernetes.io/projected/21b056d4-86a9-4bdc-a052-8cea0b28efac-kube-api-access-g7h8j\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tg64c\" (UID: \"21b056d4-86a9-4bdc-a052-8cea0b28efac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c" Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.113499 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21b056d4-86a9-4bdc-a052-8cea0b28efac-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tg64c\" (UID: \"21b056d4-86a9-4bdc-a052-8cea0b28efac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c" Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.113546 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21b056d4-86a9-4bdc-a052-8cea0b28efac-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tg64c\" (UID: \"21b056d4-86a9-4bdc-a052-8cea0b28efac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c" Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.215539 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7h8j\" (UniqueName: \"kubernetes.io/projected/21b056d4-86a9-4bdc-a052-8cea0b28efac-kube-api-access-g7h8j\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tg64c\" (UID: \"21b056d4-86a9-4bdc-a052-8cea0b28efac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c" Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.215696 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21b056d4-86a9-4bdc-a052-8cea0b28efac-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tg64c\" (UID: \"21b056d4-86a9-4bdc-a052-8cea0b28efac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c" Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.215765 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21b056d4-86a9-4bdc-a052-8cea0b28efac-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tg64c\" (UID: \"21b056d4-86a9-4bdc-a052-8cea0b28efac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c" Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.223097 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21b056d4-86a9-4bdc-a052-8cea0b28efac-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tg64c\" (UID: 
\"21b056d4-86a9-4bdc-a052-8cea0b28efac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c" Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.229831 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21b056d4-86a9-4bdc-a052-8cea0b28efac-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tg64c\" (UID: \"21b056d4-86a9-4bdc-a052-8cea0b28efac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c" Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.234691 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7h8j\" (UniqueName: \"kubernetes.io/projected/21b056d4-86a9-4bdc-a052-8cea0b28efac-kube-api-access-g7h8j\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-tg64c\" (UID: \"21b056d4-86a9-4bdc-a052-8cea0b28efac\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c" Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.363340 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c" Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.925679 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c"] Oct 06 07:12:36 crc kubenswrapper[4845]: I1006 07:12:36.943385 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c" event={"ID":"21b056d4-86a9-4bdc-a052-8cea0b28efac","Type":"ContainerStarted","Data":"642575362f57ad71d3cae5bdc8f505b3f9f4f43f8196b6413413271e65dd7189"} Oct 06 07:12:37 crc kubenswrapper[4845]: I1006 07:12:37.954117 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c" event={"ID":"21b056d4-86a9-4bdc-a052-8cea0b28efac","Type":"ContainerStarted","Data":"7e513ecf00aac4aef8e7a80281e68d937460356a7a37cbc0e8ff5580c0798d79"} Oct 06 07:12:37 crc kubenswrapper[4845]: I1006 07:12:37.982181 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c" podStartSLOduration=1.555998288 podStartE2EDuration="1.982164211s" podCreationTimestamp="2025-10-06 07:12:36 +0000 UTC" firstStartedPulling="2025-10-06 07:12:36.937134409 +0000 UTC m=+1641.451875417" lastFinishedPulling="2025-10-06 07:12:37.363300322 +0000 UTC m=+1641.878041340" observedRunningTime="2025-10-06 07:12:37.972254478 +0000 UTC m=+1642.486995486" watchObservedRunningTime="2025-10-06 07:12:37.982164211 +0000 UTC m=+1642.496905219" Oct 06 07:12:48 crc kubenswrapper[4845]: I1006 07:12:48.227292 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:12:48 crc kubenswrapper[4845]: E1006 07:12:48.228154 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:12:57 crc kubenswrapper[4845]: I1006 07:12:57.052245 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ct7br"] Oct 06 07:12:57 crc kubenswrapper[4845]: I1006 07:12:57.062132 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ct7br"] Oct 06 07:12:58 crc kubenswrapper[4845]: I1006 07:12:58.239643 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18f578e1-f6a1-4986-99af-0ab3a17cef8a" path="/var/lib/kubelet/pods/18f578e1-f6a1-4986-99af-0ab3a17cef8a/volumes" Oct 06 07:12:59 crc kubenswrapper[4845]: I1006 07:12:59.227303 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:12:59 crc kubenswrapper[4845]: E1006 07:12:59.227944 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:13:12 crc kubenswrapper[4845]: I1006 07:13:12.227244 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:13:12 crc kubenswrapper[4845]: E1006 07:13:12.229015 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:13:15 crc kubenswrapper[4845]: I1006 07:13:15.059848 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-v4wmf"] Oct 06 07:13:15 crc kubenswrapper[4845]: I1006 07:13:15.066408 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lck5q"] Oct 06 07:13:15 crc kubenswrapper[4845]: I1006 07:13:15.073520 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-v4wmf"] Oct 06 07:13:15 crc kubenswrapper[4845]: I1006 07:13:15.081562 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lck5q"] Oct 06 07:13:16 crc kubenswrapper[4845]: I1006 07:13:16.236284 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69de2345-faa8-497d-8438-ac6a9d47f7e9" path="/var/lib/kubelet/pods/69de2345-faa8-497d-8438-ac6a9d47f7e9/volumes" Oct 06 07:13:16 crc kubenswrapper[4845]: I1006 07:13:16.237660 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc8f98cf-3856-45f2-9825-c49dfc2cf611" path="/var/lib/kubelet/pods/fc8f98cf-3856-45f2-9825-c49dfc2cf611/volumes" Oct 06 07:13:23 crc kubenswrapper[4845]: I1006 07:13:23.988015 4845 scope.go:117] "RemoveContainer" containerID="66390510d358fc1297b9971709fb9694480c97c4bc8bc8f0041371d0d3708327" Oct 06 07:13:24 crc kubenswrapper[4845]: I1006 07:13:24.033828 4845 scope.go:117] "RemoveContainer" containerID="5be17671e0451f2856f998311419d1377daba3d645ddaa74ae2e996455df601d" Oct 06 07:13:24 crc kubenswrapper[4845]: I1006 07:13:24.076560 4845 scope.go:117] "RemoveContainer" containerID="dabbbf01e35174a9e6a0f846904f140b5cc24c9c6cd0e0d856d5a0c87630efbf" Oct 06 07:13:24 crc 
kubenswrapper[4845]: I1006 07:13:24.145225 4845 scope.go:117] "RemoveContainer" containerID="7df2631063b929129bf29e4ef03ad22b72fd099d9aa469685ceb3afc67869674" Oct 06 07:13:24 crc kubenswrapper[4845]: I1006 07:13:24.168037 4845 scope.go:117] "RemoveContainer" containerID="e5aff9362e0cf383edc934cb47ab890d7da14b1ae0ef3dcef167fc52d69d4808" Oct 06 07:13:25 crc kubenswrapper[4845]: I1006 07:13:25.227532 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:13:25 crc kubenswrapper[4845]: E1006 07:13:25.228080 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:13:32 crc kubenswrapper[4845]: I1006 07:13:32.490218 4845 generic.go:334] "Generic (PLEG): container finished" podID="21b056d4-86a9-4bdc-a052-8cea0b28efac" containerID="7e513ecf00aac4aef8e7a80281e68d937460356a7a37cbc0e8ff5580c0798d79" exitCode=2 Oct 06 07:13:32 crc kubenswrapper[4845]: I1006 07:13:32.490275 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c" event={"ID":"21b056d4-86a9-4bdc-a052-8cea0b28efac","Type":"ContainerDied","Data":"7e513ecf00aac4aef8e7a80281e68d937460356a7a37cbc0e8ff5580c0798d79"} Oct 06 07:13:33 crc kubenswrapper[4845]: I1006 07:13:33.880468 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c" Oct 06 07:13:34 crc kubenswrapper[4845]: I1006 07:13:34.032948 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7h8j\" (UniqueName: \"kubernetes.io/projected/21b056d4-86a9-4bdc-a052-8cea0b28efac-kube-api-access-g7h8j\") pod \"21b056d4-86a9-4bdc-a052-8cea0b28efac\" (UID: \"21b056d4-86a9-4bdc-a052-8cea0b28efac\") " Oct 06 07:13:34 crc kubenswrapper[4845]: I1006 07:13:34.032987 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21b056d4-86a9-4bdc-a052-8cea0b28efac-ssh-key\") pod \"21b056d4-86a9-4bdc-a052-8cea0b28efac\" (UID: \"21b056d4-86a9-4bdc-a052-8cea0b28efac\") " Oct 06 07:13:34 crc kubenswrapper[4845]: I1006 07:13:34.033149 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21b056d4-86a9-4bdc-a052-8cea0b28efac-inventory\") pod \"21b056d4-86a9-4bdc-a052-8cea0b28efac\" (UID: \"21b056d4-86a9-4bdc-a052-8cea0b28efac\") " Oct 06 07:13:34 crc kubenswrapper[4845]: I1006 07:13:34.038159 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b056d4-86a9-4bdc-a052-8cea0b28efac-kube-api-access-g7h8j" (OuterVolumeSpecName: "kube-api-access-g7h8j") pod "21b056d4-86a9-4bdc-a052-8cea0b28efac" (UID: "21b056d4-86a9-4bdc-a052-8cea0b28efac"). InnerVolumeSpecName "kube-api-access-g7h8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:13:34 crc kubenswrapper[4845]: I1006 07:13:34.058307 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21b056d4-86a9-4bdc-a052-8cea0b28efac-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "21b056d4-86a9-4bdc-a052-8cea0b28efac" (UID: "21b056d4-86a9-4bdc-a052-8cea0b28efac"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:13:34 crc kubenswrapper[4845]: I1006 07:13:34.058492 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21b056d4-86a9-4bdc-a052-8cea0b28efac-inventory" (OuterVolumeSpecName: "inventory") pod "21b056d4-86a9-4bdc-a052-8cea0b28efac" (UID: "21b056d4-86a9-4bdc-a052-8cea0b28efac"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:13:34 crc kubenswrapper[4845]: I1006 07:13:34.135026 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7h8j\" (UniqueName: \"kubernetes.io/projected/21b056d4-86a9-4bdc-a052-8cea0b28efac-kube-api-access-g7h8j\") on node \"crc\" DevicePath \"\"" Oct 06 07:13:34 crc kubenswrapper[4845]: I1006 07:13:34.135059 4845 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21b056d4-86a9-4bdc-a052-8cea0b28efac-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 07:13:34 crc kubenswrapper[4845]: I1006 07:13:34.135069 4845 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21b056d4-86a9-4bdc-a052-8cea0b28efac-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 07:13:34 crc kubenswrapper[4845]: I1006 07:13:34.523929 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c" event={"ID":"21b056d4-86a9-4bdc-a052-8cea0b28efac","Type":"ContainerDied","Data":"642575362f57ad71d3cae5bdc8f505b3f9f4f43f8196b6413413271e65dd7189"} Oct 06 07:13:34 crc kubenswrapper[4845]: I1006 07:13:34.523982 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-tg64c" Oct 06 07:13:34 crc kubenswrapper[4845]: I1006 07:13:34.524017 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="642575362f57ad71d3cae5bdc8f505b3f9f4f43f8196b6413413271e65dd7189" Oct 06 07:13:39 crc kubenswrapper[4845]: I1006 07:13:39.228337 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:13:39 crc kubenswrapper[4845]: E1006 07:13:39.229318 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.024639 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk"] Oct 06 07:13:41 crc kubenswrapper[4845]: E1006 07:13:41.025547 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b056d4-86a9-4bdc-a052-8cea0b28efac" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.025562 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b056d4-86a9-4bdc-a052-8cea0b28efac" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.025743 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="21b056d4-86a9-4bdc-a052-8cea0b28efac" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.026403 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk" Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.029929 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p48vv" Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.030992 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.031205 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.031259 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.051208 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk"] Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.186810 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57c9565a-619a-48cd-af5f-1dc8141f82af-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-h4msk\" (UID: \"57c9565a-619a-48cd-af5f-1dc8141f82af\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk" Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.186885 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn69l\" (UniqueName: \"kubernetes.io/projected/57c9565a-619a-48cd-af5f-1dc8141f82af-kube-api-access-vn69l\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-h4msk\" (UID: \"57c9565a-619a-48cd-af5f-1dc8141f82af\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk" Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.186911 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57c9565a-619a-48cd-af5f-1dc8141f82af-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-h4msk\" (UID: \"57c9565a-619a-48cd-af5f-1dc8141f82af\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk" Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.289060 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn69l\" (UniqueName: \"kubernetes.io/projected/57c9565a-619a-48cd-af5f-1dc8141f82af-kube-api-access-vn69l\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-h4msk\" (UID: \"57c9565a-619a-48cd-af5f-1dc8141f82af\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk" Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.289105 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57c9565a-619a-48cd-af5f-1dc8141f82af-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-h4msk\" (UID: \"57c9565a-619a-48cd-af5f-1dc8141f82af\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk" Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.289860 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57c9565a-619a-48cd-af5f-1dc8141f82af-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-h4msk\" (UID: \"57c9565a-619a-48cd-af5f-1dc8141f82af\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk" Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.295697 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57c9565a-619a-48cd-af5f-1dc8141f82af-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-h4msk\" (UID: 
\"57c9565a-619a-48cd-af5f-1dc8141f82af\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk" Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.300663 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57c9565a-619a-48cd-af5f-1dc8141f82af-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-h4msk\" (UID: \"57c9565a-619a-48cd-af5f-1dc8141f82af\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk" Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.312969 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn69l\" (UniqueName: \"kubernetes.io/projected/57c9565a-619a-48cd-af5f-1dc8141f82af-kube-api-access-vn69l\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-h4msk\" (UID: \"57c9565a-619a-48cd-af5f-1dc8141f82af\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk" Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.354249 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk" Oct 06 07:13:41 crc kubenswrapper[4845]: W1006 07:13:41.829897 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57c9565a_619a_48cd_af5f_1dc8141f82af.slice/crio-a82295a05759c155c96e340d10d8c5b3eed698cc8823ea3d478a9a111881b39a WatchSource:0}: Error finding container a82295a05759c155c96e340d10d8c5b3eed698cc8823ea3d478a9a111881b39a: Status 404 returned error can't find the container with id a82295a05759c155c96e340d10d8c5b3eed698cc8823ea3d478a9a111881b39a Oct 06 07:13:41 crc kubenswrapper[4845]: I1006 07:13:41.830249 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk"] Oct 06 07:13:42 crc kubenswrapper[4845]: I1006 07:13:42.599948 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk" event={"ID":"57c9565a-619a-48cd-af5f-1dc8141f82af","Type":"ContainerStarted","Data":"a82295a05759c155c96e340d10d8c5b3eed698cc8823ea3d478a9a111881b39a"} Oct 06 07:13:43 crc kubenswrapper[4845]: I1006 07:13:43.611917 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk" event={"ID":"57c9565a-619a-48cd-af5f-1dc8141f82af","Type":"ContainerStarted","Data":"c8721323c868239e28d33ebddeade5d8ac4367b3b767fd80af49925d26cef40c"} Oct 06 07:13:43 crc kubenswrapper[4845]: I1006 07:13:43.632427 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk" podStartSLOduration=1.8772816369999998 podStartE2EDuration="2.632400316s" podCreationTimestamp="2025-10-06 07:13:41 +0000 UTC" firstStartedPulling="2025-10-06 07:13:41.832108452 +0000 UTC m=+1706.346849460" lastFinishedPulling="2025-10-06 07:13:42.587227131 +0000 UTC 
m=+1707.101968139" observedRunningTime="2025-10-06 07:13:43.627150661 +0000 UTC m=+1708.141891689" watchObservedRunningTime="2025-10-06 07:13:43.632400316 +0000 UTC m=+1708.147141354" Oct 06 07:13:52 crc kubenswrapper[4845]: I1006 07:13:52.226999 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:13:52 crc kubenswrapper[4845]: E1006 07:13:52.227746 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:13:58 crc kubenswrapper[4845]: I1006 07:13:58.037279 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-bnlg8"] Oct 06 07:13:58 crc kubenswrapper[4845]: I1006 07:13:58.044361 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-bnlg8"] Oct 06 07:13:58 crc kubenswrapper[4845]: I1006 07:13:58.237506 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71e8ec4e-fe1b-46eb-9a91-e13178876378" path="/var/lib/kubelet/pods/71e8ec4e-fe1b-46eb-9a91-e13178876378/volumes" Oct 06 07:14:04 crc kubenswrapper[4845]: I1006 07:14:04.227296 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:14:04 crc kubenswrapper[4845]: E1006 07:14:04.231749 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:14:12 crc kubenswrapper[4845]: I1006 07:14:12.799139 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-284sp"] Oct 06 07:14:12 crc kubenswrapper[4845]: I1006 07:14:12.802499 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-284sp" Oct 06 07:14:12 crc kubenswrapper[4845]: I1006 07:14:12.815956 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-284sp"] Oct 06 07:14:12 crc kubenswrapper[4845]: I1006 07:14:12.993259 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4-catalog-content\") pod \"redhat-operators-284sp\" (UID: \"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4\") " pod="openshift-marketplace/redhat-operators-284sp" Oct 06 07:14:12 crc kubenswrapper[4845]: I1006 07:14:12.993308 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ddpz\" (UniqueName: \"kubernetes.io/projected/8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4-kube-api-access-2ddpz\") pod \"redhat-operators-284sp\" (UID: \"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4\") " pod="openshift-marketplace/redhat-operators-284sp" Oct 06 07:14:12 crc kubenswrapper[4845]: I1006 07:14:12.993401 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4-utilities\") pod \"redhat-operators-284sp\" (UID: \"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4\") " pod="openshift-marketplace/redhat-operators-284sp" Oct 06 07:14:12 crc kubenswrapper[4845]: I1006 07:14:12.994423 4845 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-h7zbd"] Oct 06 07:14:12 crc kubenswrapper[4845]: I1006 07:14:12.996601 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7zbd" Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.004060 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7zbd"] Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.095656 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4-catalog-content\") pod \"redhat-operators-284sp\" (UID: \"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4\") " pod="openshift-marketplace/redhat-operators-284sp" Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.095700 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ddpz\" (UniqueName: \"kubernetes.io/projected/8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4-kube-api-access-2ddpz\") pod \"redhat-operators-284sp\" (UID: \"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4\") " pod="openshift-marketplace/redhat-operators-284sp" Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.095784 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4-utilities\") pod \"redhat-operators-284sp\" (UID: \"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4\") " pod="openshift-marketplace/redhat-operators-284sp" Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.095811 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq8pr\" (UniqueName: \"kubernetes.io/projected/3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4-kube-api-access-nq8pr\") pod \"redhat-marketplace-h7zbd\" (UID: \"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4\") " 
pod="openshift-marketplace/redhat-marketplace-h7zbd" Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.095840 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4-utilities\") pod \"redhat-marketplace-h7zbd\" (UID: \"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4\") " pod="openshift-marketplace/redhat-marketplace-h7zbd" Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.095858 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4-catalog-content\") pod \"redhat-marketplace-h7zbd\" (UID: \"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4\") " pod="openshift-marketplace/redhat-marketplace-h7zbd" Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.096439 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4-catalog-content\") pod \"redhat-operators-284sp\" (UID: \"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4\") " pod="openshift-marketplace/redhat-operators-284sp" Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.096713 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4-utilities\") pod \"redhat-operators-284sp\" (UID: \"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4\") " pod="openshift-marketplace/redhat-operators-284sp" Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.118214 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ddpz\" (UniqueName: \"kubernetes.io/projected/8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4-kube-api-access-2ddpz\") pod \"redhat-operators-284sp\" (UID: \"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4\") " 
pod="openshift-marketplace/redhat-operators-284sp" Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.163948 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-284sp" Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.197095 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq8pr\" (UniqueName: \"kubernetes.io/projected/3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4-kube-api-access-nq8pr\") pod \"redhat-marketplace-h7zbd\" (UID: \"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4\") " pod="openshift-marketplace/redhat-marketplace-h7zbd" Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.197145 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4-utilities\") pod \"redhat-marketplace-h7zbd\" (UID: \"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4\") " pod="openshift-marketplace/redhat-marketplace-h7zbd" Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.197167 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4-catalog-content\") pod \"redhat-marketplace-h7zbd\" (UID: \"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4\") " pod="openshift-marketplace/redhat-marketplace-h7zbd" Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.198088 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4-utilities\") pod \"redhat-marketplace-h7zbd\" (UID: \"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4\") " pod="openshift-marketplace/redhat-marketplace-h7zbd" Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.198134 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4-catalog-content\") pod \"redhat-marketplace-h7zbd\" (UID: \"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4\") " pod="openshift-marketplace/redhat-marketplace-h7zbd" Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.217474 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq8pr\" (UniqueName: \"kubernetes.io/projected/3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4-kube-api-access-nq8pr\") pod \"redhat-marketplace-h7zbd\" (UID: \"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4\") " pod="openshift-marketplace/redhat-marketplace-h7zbd" Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.314071 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7zbd" Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.696362 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-284sp"] Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.800270 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7zbd"] Oct 06 07:14:13 crc kubenswrapper[4845]: W1006 07:14:13.800643 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bcf1f0a_e2f9_4dd4_be9e_abb03165f9a4.slice/crio-2400d2b62db27d575af9dca71dac7d10b9822f012433c3241e38453d3fdcbc10 WatchSource:0}: Error finding container 2400d2b62db27d575af9dca71dac7d10b9822f012433c3241e38453d3fdcbc10: Status 404 returned error can't find the container with id 2400d2b62db27d575af9dca71dac7d10b9822f012433c3241e38453d3fdcbc10 Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.869203 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284sp" 
event={"ID":"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4","Type":"ContainerStarted","Data":"9da410cb10b8cc3b2d26f864526f3bd3abdcf25681bb4807fb33200b2af10df7"} Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.869270 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284sp" event={"ID":"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4","Type":"ContainerStarted","Data":"f1c2fea8cce1ec9ec2af731fe9bd63d996891abf4e806b35bee3184ad362ba04"} Oct 06 07:14:13 crc kubenswrapper[4845]: I1006 07:14:13.871122 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7zbd" event={"ID":"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4","Type":"ContainerStarted","Data":"2400d2b62db27d575af9dca71dac7d10b9822f012433c3241e38453d3fdcbc10"} Oct 06 07:14:14 crc kubenswrapper[4845]: I1006 07:14:14.880442 4845 generic.go:334] "Generic (PLEG): container finished" podID="8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4" containerID="9da410cb10b8cc3b2d26f864526f3bd3abdcf25681bb4807fb33200b2af10df7" exitCode=0 Oct 06 07:14:14 crc kubenswrapper[4845]: I1006 07:14:14.880488 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284sp" event={"ID":"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4","Type":"ContainerDied","Data":"9da410cb10b8cc3b2d26f864526f3bd3abdcf25681bb4807fb33200b2af10df7"} Oct 06 07:14:14 crc kubenswrapper[4845]: I1006 07:14:14.882982 4845 generic.go:334] "Generic (PLEG): container finished" podID="3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4" containerID="25a6449764ce99eb39e50081a0ba3ef6ec1588b15ac621977c7d34a0277c996c" exitCode=0 Oct 06 07:14:14 crc kubenswrapper[4845]: I1006 07:14:14.883015 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7zbd" event={"ID":"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4","Type":"ContainerDied","Data":"25a6449764ce99eb39e50081a0ba3ef6ec1588b15ac621977c7d34a0277c996c"} Oct 06 07:14:15 crc kubenswrapper[4845]: I1006 
07:14:15.892937 4845 generic.go:334] "Generic (PLEG): container finished" podID="3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4" containerID="10874a34f8a8a9918f9ec46f3f4aed6f831f8d2bcb7269f4c0eaf20507853af5" exitCode=0 Oct 06 07:14:15 crc kubenswrapper[4845]: I1006 07:14:15.892976 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7zbd" event={"ID":"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4","Type":"ContainerDied","Data":"10874a34f8a8a9918f9ec46f3f4aed6f831f8d2bcb7269f4c0eaf20507853af5"} Oct 06 07:14:16 crc kubenswrapper[4845]: I1006 07:14:16.905413 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7zbd" event={"ID":"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4","Type":"ContainerStarted","Data":"e66c70e48ea6b5911896cab9ebcf8c82b2d98641481b538bfa468dc9e81998f8"} Oct 06 07:14:16 crc kubenswrapper[4845]: I1006 07:14:16.908974 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284sp" event={"ID":"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4","Type":"ContainerStarted","Data":"7bab055d80f58eb1e1b300ea9e5ed6843b64db45e06add473642b7979ae4dfa8"} Oct 06 07:14:16 crc kubenswrapper[4845]: I1006 07:14:16.926167 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h7zbd" podStartSLOduration=3.342494856 podStartE2EDuration="4.926146971s" podCreationTimestamp="2025-10-06 07:14:12 +0000 UTC" firstStartedPulling="2025-10-06 07:14:14.885260791 +0000 UTC m=+1739.400001799" lastFinishedPulling="2025-10-06 07:14:16.468912906 +0000 UTC m=+1740.983653914" observedRunningTime="2025-10-06 07:14:16.925658118 +0000 UTC m=+1741.440399146" watchObservedRunningTime="2025-10-06 07:14:16.926146971 +0000 UTC m=+1741.440887979" Oct 06 07:14:17 crc kubenswrapper[4845]: I1006 07:14:17.924165 4845 generic.go:334] "Generic (PLEG): container finished" podID="8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4" 
containerID="7bab055d80f58eb1e1b300ea9e5ed6843b64db45e06add473642b7979ae4dfa8" exitCode=0 Oct 06 07:14:17 crc kubenswrapper[4845]: I1006 07:14:17.924245 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284sp" event={"ID":"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4","Type":"ContainerDied","Data":"7bab055d80f58eb1e1b300ea9e5ed6843b64db45e06add473642b7979ae4dfa8"} Oct 06 07:14:18 crc kubenswrapper[4845]: I1006 07:14:18.227113 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:14:18 crc kubenswrapper[4845]: E1006 07:14:18.227672 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:14:19 crc kubenswrapper[4845]: I1006 07:14:19.954348 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284sp" event={"ID":"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4","Type":"ContainerStarted","Data":"c3a53ce3323eac903de808d7ca570b73bf916ba8ed62210c53d96f9914651264"} Oct 06 07:14:19 crc kubenswrapper[4845]: I1006 07:14:19.979634 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-284sp" podStartSLOduration=4.095043251 podStartE2EDuration="7.979616311s" podCreationTimestamp="2025-10-06 07:14:12 +0000 UTC" firstStartedPulling="2025-10-06 07:14:14.882441238 +0000 UTC m=+1739.397182246" lastFinishedPulling="2025-10-06 07:14:18.767014298 +0000 UTC m=+1743.281755306" observedRunningTime="2025-10-06 07:14:19.973957935 +0000 UTC m=+1744.488698943" watchObservedRunningTime="2025-10-06 
07:14:19.979616311 +0000 UTC m=+1744.494357309" Oct 06 07:14:23 crc kubenswrapper[4845]: I1006 07:14:23.164892 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-284sp" Oct 06 07:14:23 crc kubenswrapper[4845]: I1006 07:14:23.165177 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-284sp" Oct 06 07:14:23 crc kubenswrapper[4845]: I1006 07:14:23.210010 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-284sp" Oct 06 07:14:23 crc kubenswrapper[4845]: I1006 07:14:23.315569 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h7zbd" Oct 06 07:14:23 crc kubenswrapper[4845]: I1006 07:14:23.315624 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h7zbd" Oct 06 07:14:23 crc kubenswrapper[4845]: I1006 07:14:23.358268 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h7zbd" Oct 06 07:14:24 crc kubenswrapper[4845]: I1006 07:14:24.025519 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-284sp" Oct 06 07:14:24 crc kubenswrapper[4845]: I1006 07:14:24.027442 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h7zbd" Oct 06 07:14:24 crc kubenswrapper[4845]: I1006 07:14:24.341990 4845 scope.go:117] "RemoveContainer" containerID="9ec90fe4ab97cc5b720940614641aacb14ec677cc7789f10ec8c630706b0455a" Oct 06 07:14:24 crc kubenswrapper[4845]: I1006 07:14:24.982988 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-284sp"] Oct 06 07:14:25 crc kubenswrapper[4845]: I1006 07:14:25.997817 4845 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-284sp" podUID="8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4" containerName="registry-server" containerID="cri-o://c3a53ce3323eac903de808d7ca570b73bf916ba8ed62210c53d96f9914651264" gracePeriod=2 Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.397100 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7zbd"] Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.397406 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h7zbd" podUID="3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4" containerName="registry-server" containerID="cri-o://e66c70e48ea6b5911896cab9ebcf8c82b2d98641481b538bfa468dc9e81998f8" gracePeriod=2 Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.416053 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-284sp" Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.571861 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4-utilities\") pod \"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4\" (UID: \"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4\") " Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.571931 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ddpz\" (UniqueName: \"kubernetes.io/projected/8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4-kube-api-access-2ddpz\") pod \"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4\" (UID: \"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4\") " Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.572138 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4-catalog-content\") pod \"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4\" (UID: \"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4\") " Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.572611 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4-utilities" (OuterVolumeSpecName: "utilities") pod "8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4" (UID: "8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.572817 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.579616 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4-kube-api-access-2ddpz" (OuterVolumeSpecName: "kube-api-access-2ddpz") pod "8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4" (UID: "8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4"). InnerVolumeSpecName "kube-api-access-2ddpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.664180 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4" (UID: "8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.674888 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.674928 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ddpz\" (UniqueName: \"kubernetes.io/projected/8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4-kube-api-access-2ddpz\") on node \"crc\" DevicePath \"\"" Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.745475 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7zbd" Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.877968 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4-utilities\") pod \"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4\" (UID: \"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4\") " Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.878022 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4-catalog-content\") pod \"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4\" (UID: \"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4\") " Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.878122 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq8pr\" (UniqueName: \"kubernetes.io/projected/3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4-kube-api-access-nq8pr\") pod \"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4\" (UID: \"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4\") " Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.879545 4845 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4-utilities" (OuterVolumeSpecName: "utilities") pod "3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4" (UID: "3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.881978 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4-kube-api-access-nq8pr" (OuterVolumeSpecName: "kube-api-access-nq8pr") pod "3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4" (UID: "3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4"). InnerVolumeSpecName "kube-api-access-nq8pr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.889602 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4" (UID: "3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.985636 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq8pr\" (UniqueName: \"kubernetes.io/projected/3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4-kube-api-access-nq8pr\") on node \"crc\" DevicePath \"\"" Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.985680 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 07:14:26 crc kubenswrapper[4845]: I1006 07:14:26.985695 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.007207 4845 generic.go:334] "Generic (PLEG): container finished" podID="8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4" containerID="c3a53ce3323eac903de808d7ca570b73bf916ba8ed62210c53d96f9914651264" exitCode=0 Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.007275 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284sp" event={"ID":"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4","Type":"ContainerDied","Data":"c3a53ce3323eac903de808d7ca570b73bf916ba8ed62210c53d96f9914651264"} Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.007302 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-284sp" event={"ID":"8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4","Type":"ContainerDied","Data":"f1c2fea8cce1ec9ec2af731fe9bd63d996891abf4e806b35bee3184ad362ba04"} Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.007304 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-284sp" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.007319 4845 scope.go:117] "RemoveContainer" containerID="c3a53ce3323eac903de808d7ca570b73bf916ba8ed62210c53d96f9914651264" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.011888 4845 generic.go:334] "Generic (PLEG): container finished" podID="3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4" containerID="e66c70e48ea6b5911896cab9ebcf8c82b2d98641481b538bfa468dc9e81998f8" exitCode=0 Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.011934 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7zbd" event={"ID":"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4","Type":"ContainerDied","Data":"e66c70e48ea6b5911896cab9ebcf8c82b2d98641481b538bfa468dc9e81998f8"} Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.012017 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7zbd" event={"ID":"3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4","Type":"ContainerDied","Data":"2400d2b62db27d575af9dca71dac7d10b9822f012433c3241e38453d3fdcbc10"} Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.012101 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7zbd" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.030560 4845 scope.go:117] "RemoveContainer" containerID="7bab055d80f58eb1e1b300ea9e5ed6843b64db45e06add473642b7979ae4dfa8" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.052898 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-284sp"] Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.059898 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-284sp"] Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.065593 4845 scope.go:117] "RemoveContainer" containerID="9da410cb10b8cc3b2d26f864526f3bd3abdcf25681bb4807fb33200b2af10df7" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.067702 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7zbd"] Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.074840 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7zbd"] Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.084692 4845 scope.go:117] "RemoveContainer" containerID="c3a53ce3323eac903de808d7ca570b73bf916ba8ed62210c53d96f9914651264" Oct 06 07:14:27 crc kubenswrapper[4845]: E1006 07:14:27.085099 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3a53ce3323eac903de808d7ca570b73bf916ba8ed62210c53d96f9914651264\": container with ID starting with c3a53ce3323eac903de808d7ca570b73bf916ba8ed62210c53d96f9914651264 not found: ID does not exist" containerID="c3a53ce3323eac903de808d7ca570b73bf916ba8ed62210c53d96f9914651264" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.085195 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3a53ce3323eac903de808d7ca570b73bf916ba8ed62210c53d96f9914651264"} 
err="failed to get container status \"c3a53ce3323eac903de808d7ca570b73bf916ba8ed62210c53d96f9914651264\": rpc error: code = NotFound desc = could not find container \"c3a53ce3323eac903de808d7ca570b73bf916ba8ed62210c53d96f9914651264\": container with ID starting with c3a53ce3323eac903de808d7ca570b73bf916ba8ed62210c53d96f9914651264 not found: ID does not exist" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.085277 4845 scope.go:117] "RemoveContainer" containerID="7bab055d80f58eb1e1b300ea9e5ed6843b64db45e06add473642b7979ae4dfa8" Oct 06 07:14:27 crc kubenswrapper[4845]: E1006 07:14:27.085735 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bab055d80f58eb1e1b300ea9e5ed6843b64db45e06add473642b7979ae4dfa8\": container with ID starting with 7bab055d80f58eb1e1b300ea9e5ed6843b64db45e06add473642b7979ae4dfa8 not found: ID does not exist" containerID="7bab055d80f58eb1e1b300ea9e5ed6843b64db45e06add473642b7979ae4dfa8" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.085773 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bab055d80f58eb1e1b300ea9e5ed6843b64db45e06add473642b7979ae4dfa8"} err="failed to get container status \"7bab055d80f58eb1e1b300ea9e5ed6843b64db45e06add473642b7979ae4dfa8\": rpc error: code = NotFound desc = could not find container \"7bab055d80f58eb1e1b300ea9e5ed6843b64db45e06add473642b7979ae4dfa8\": container with ID starting with 7bab055d80f58eb1e1b300ea9e5ed6843b64db45e06add473642b7979ae4dfa8 not found: ID does not exist" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.085800 4845 scope.go:117] "RemoveContainer" containerID="9da410cb10b8cc3b2d26f864526f3bd3abdcf25681bb4807fb33200b2af10df7" Oct 06 07:14:27 crc kubenswrapper[4845]: E1006 07:14:27.086535 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9da410cb10b8cc3b2d26f864526f3bd3abdcf25681bb4807fb33200b2af10df7\": container with ID starting with 9da410cb10b8cc3b2d26f864526f3bd3abdcf25681bb4807fb33200b2af10df7 not found: ID does not exist" containerID="9da410cb10b8cc3b2d26f864526f3bd3abdcf25681bb4807fb33200b2af10df7" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.086563 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9da410cb10b8cc3b2d26f864526f3bd3abdcf25681bb4807fb33200b2af10df7"} err="failed to get container status \"9da410cb10b8cc3b2d26f864526f3bd3abdcf25681bb4807fb33200b2af10df7\": rpc error: code = NotFound desc = could not find container \"9da410cb10b8cc3b2d26f864526f3bd3abdcf25681bb4807fb33200b2af10df7\": container with ID starting with 9da410cb10b8cc3b2d26f864526f3bd3abdcf25681bb4807fb33200b2af10df7 not found: ID does not exist" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.086579 4845 scope.go:117] "RemoveContainer" containerID="e66c70e48ea6b5911896cab9ebcf8c82b2d98641481b538bfa468dc9e81998f8" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.102968 4845 scope.go:117] "RemoveContainer" containerID="10874a34f8a8a9918f9ec46f3f4aed6f831f8d2bcb7269f4c0eaf20507853af5" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.169578 4845 scope.go:117] "RemoveContainer" containerID="25a6449764ce99eb39e50081a0ba3ef6ec1588b15ac621977c7d34a0277c996c" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.223393 4845 scope.go:117] "RemoveContainer" containerID="e66c70e48ea6b5911896cab9ebcf8c82b2d98641481b538bfa468dc9e81998f8" Oct 06 07:14:27 crc kubenswrapper[4845]: E1006 07:14:27.223875 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e66c70e48ea6b5911896cab9ebcf8c82b2d98641481b538bfa468dc9e81998f8\": container with ID starting with e66c70e48ea6b5911896cab9ebcf8c82b2d98641481b538bfa468dc9e81998f8 not found: ID does not exist" 
containerID="e66c70e48ea6b5911896cab9ebcf8c82b2d98641481b538bfa468dc9e81998f8" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.223909 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e66c70e48ea6b5911896cab9ebcf8c82b2d98641481b538bfa468dc9e81998f8"} err="failed to get container status \"e66c70e48ea6b5911896cab9ebcf8c82b2d98641481b538bfa468dc9e81998f8\": rpc error: code = NotFound desc = could not find container \"e66c70e48ea6b5911896cab9ebcf8c82b2d98641481b538bfa468dc9e81998f8\": container with ID starting with e66c70e48ea6b5911896cab9ebcf8c82b2d98641481b538bfa468dc9e81998f8 not found: ID does not exist" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.223935 4845 scope.go:117] "RemoveContainer" containerID="10874a34f8a8a9918f9ec46f3f4aed6f831f8d2bcb7269f4c0eaf20507853af5" Oct 06 07:14:27 crc kubenswrapper[4845]: E1006 07:14:27.224209 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10874a34f8a8a9918f9ec46f3f4aed6f831f8d2bcb7269f4c0eaf20507853af5\": container with ID starting with 10874a34f8a8a9918f9ec46f3f4aed6f831f8d2bcb7269f4c0eaf20507853af5 not found: ID does not exist" containerID="10874a34f8a8a9918f9ec46f3f4aed6f831f8d2bcb7269f4c0eaf20507853af5" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.224257 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10874a34f8a8a9918f9ec46f3f4aed6f831f8d2bcb7269f4c0eaf20507853af5"} err="failed to get container status \"10874a34f8a8a9918f9ec46f3f4aed6f831f8d2bcb7269f4c0eaf20507853af5\": rpc error: code = NotFound desc = could not find container \"10874a34f8a8a9918f9ec46f3f4aed6f831f8d2bcb7269f4c0eaf20507853af5\": container with ID starting with 10874a34f8a8a9918f9ec46f3f4aed6f831f8d2bcb7269f4c0eaf20507853af5 not found: ID does not exist" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.224278 4845 scope.go:117] 
"RemoveContainer" containerID="25a6449764ce99eb39e50081a0ba3ef6ec1588b15ac621977c7d34a0277c996c" Oct 06 07:14:27 crc kubenswrapper[4845]: E1006 07:14:27.224584 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25a6449764ce99eb39e50081a0ba3ef6ec1588b15ac621977c7d34a0277c996c\": container with ID starting with 25a6449764ce99eb39e50081a0ba3ef6ec1588b15ac621977c7d34a0277c996c not found: ID does not exist" containerID="25a6449764ce99eb39e50081a0ba3ef6ec1588b15ac621977c7d34a0277c996c" Oct 06 07:14:27 crc kubenswrapper[4845]: I1006 07:14:27.224610 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25a6449764ce99eb39e50081a0ba3ef6ec1588b15ac621977c7d34a0277c996c"} err="failed to get container status \"25a6449764ce99eb39e50081a0ba3ef6ec1588b15ac621977c7d34a0277c996c\": rpc error: code = NotFound desc = could not find container \"25a6449764ce99eb39e50081a0ba3ef6ec1588b15ac621977c7d34a0277c996c\": container with ID starting with 25a6449764ce99eb39e50081a0ba3ef6ec1588b15ac621977c7d34a0277c996c not found: ID does not exist" Oct 06 07:14:28 crc kubenswrapper[4845]: I1006 07:14:28.236571 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4" path="/var/lib/kubelet/pods/3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4/volumes" Oct 06 07:14:28 crc kubenswrapper[4845]: I1006 07:14:28.237756 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4" path="/var/lib/kubelet/pods/8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4/volumes" Oct 06 07:14:29 crc kubenswrapper[4845]: I1006 07:14:29.028872 4845 generic.go:334] "Generic (PLEG): container finished" podID="57c9565a-619a-48cd-af5f-1dc8141f82af" containerID="c8721323c868239e28d33ebddeade5d8ac4367b3b767fd80af49925d26cef40c" exitCode=0 Oct 06 07:14:29 crc kubenswrapper[4845]: I1006 07:14:29.028966 4845 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk" event={"ID":"57c9565a-619a-48cd-af5f-1dc8141f82af","Type":"ContainerDied","Data":"c8721323c868239e28d33ebddeade5d8ac4367b3b767fd80af49925d26cef40c"} Oct 06 07:14:30 crc kubenswrapper[4845]: I1006 07:14:30.443745 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk" Oct 06 07:14:30 crc kubenswrapper[4845]: I1006 07:14:30.547855 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57c9565a-619a-48cd-af5f-1dc8141f82af-inventory\") pod \"57c9565a-619a-48cd-af5f-1dc8141f82af\" (UID: \"57c9565a-619a-48cd-af5f-1dc8141f82af\") " Oct 06 07:14:30 crc kubenswrapper[4845]: I1006 07:14:30.547968 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn69l\" (UniqueName: \"kubernetes.io/projected/57c9565a-619a-48cd-af5f-1dc8141f82af-kube-api-access-vn69l\") pod \"57c9565a-619a-48cd-af5f-1dc8141f82af\" (UID: \"57c9565a-619a-48cd-af5f-1dc8141f82af\") " Oct 06 07:14:30 crc kubenswrapper[4845]: I1006 07:14:30.548178 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57c9565a-619a-48cd-af5f-1dc8141f82af-ssh-key\") pod \"57c9565a-619a-48cd-af5f-1dc8141f82af\" (UID: \"57c9565a-619a-48cd-af5f-1dc8141f82af\") " Oct 06 07:14:30 crc kubenswrapper[4845]: I1006 07:14:30.554139 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57c9565a-619a-48cd-af5f-1dc8141f82af-kube-api-access-vn69l" (OuterVolumeSpecName: "kube-api-access-vn69l") pod "57c9565a-619a-48cd-af5f-1dc8141f82af" (UID: "57c9565a-619a-48cd-af5f-1dc8141f82af"). InnerVolumeSpecName "kube-api-access-vn69l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:14:30 crc kubenswrapper[4845]: I1006 07:14:30.574658 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57c9565a-619a-48cd-af5f-1dc8141f82af-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "57c9565a-619a-48cd-af5f-1dc8141f82af" (UID: "57c9565a-619a-48cd-af5f-1dc8141f82af"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:14:30 crc kubenswrapper[4845]: I1006 07:14:30.575924 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57c9565a-619a-48cd-af5f-1dc8141f82af-inventory" (OuterVolumeSpecName: "inventory") pod "57c9565a-619a-48cd-af5f-1dc8141f82af" (UID: "57c9565a-619a-48cd-af5f-1dc8141f82af"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:14:30 crc kubenswrapper[4845]: I1006 07:14:30.650744 4845 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57c9565a-619a-48cd-af5f-1dc8141f82af-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 07:14:30 crc kubenswrapper[4845]: I1006 07:14:30.651328 4845 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57c9565a-619a-48cd-af5f-1dc8141f82af-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 07:14:30 crc kubenswrapper[4845]: I1006 07:14:30.651345 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn69l\" (UniqueName: \"kubernetes.io/projected/57c9565a-619a-48cd-af5f-1dc8141f82af-kube-api-access-vn69l\") on node \"crc\" DevicePath \"\"" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.044792 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk" 
event={"ID":"57c9565a-619a-48cd-af5f-1dc8141f82af","Type":"ContainerDied","Data":"a82295a05759c155c96e340d10d8c5b3eed698cc8823ea3d478a9a111881b39a"} Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.045177 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a82295a05759c155c96e340d10d8c5b3eed698cc8823ea3d478a9a111881b39a" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.044873 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-h4msk" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.145237 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dmcpx"] Oct 06 07:14:31 crc kubenswrapper[4845]: E1006 07:14:31.145672 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4" containerName="registry-server" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.145689 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4" containerName="registry-server" Oct 06 07:14:31 crc kubenswrapper[4845]: E1006 07:14:31.145717 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4" containerName="registry-server" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.145724 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4" containerName="registry-server" Oct 06 07:14:31 crc kubenswrapper[4845]: E1006 07:14:31.145747 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57c9565a-619a-48cd-af5f-1dc8141f82af" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.145755 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="57c9565a-619a-48cd-af5f-1dc8141f82af" 
containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 07:14:31 crc kubenswrapper[4845]: E1006 07:14:31.145767 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4" containerName="extract-utilities" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.145773 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4" containerName="extract-utilities" Oct 06 07:14:31 crc kubenswrapper[4845]: E1006 07:14:31.145785 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4" containerName="extract-content" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.145791 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4" containerName="extract-content" Oct 06 07:14:31 crc kubenswrapper[4845]: E1006 07:14:31.145800 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4" containerName="extract-content" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.145805 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4" containerName="extract-content" Oct 06 07:14:31 crc kubenswrapper[4845]: E1006 07:14:31.145812 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4" containerName="extract-utilities" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.145818 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4" containerName="extract-utilities" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.145962 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fa53a2c-5c00-4030-8d24-7dc6cab6f2c4" containerName="registry-server" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.145987 4845 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3bcf1f0a-e2f9-4dd4-be9e-abb03165f9a4" containerName="registry-server" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.145998 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="57c9565a-619a-48cd-af5f-1dc8141f82af" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.146681 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dmcpx" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.149495 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.149537 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.149688 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.149746 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p48vv" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.161242 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dmcpx"] Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.227907 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:14:31 crc kubenswrapper[4845]: E1006 07:14:31.228135 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.262744 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c482754d-cc4d-4480-b2c3-1ae079c9222b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dmcpx\" (UID: \"c482754d-cc4d-4480-b2c3-1ae079c9222b\") " pod="openstack/ssh-known-hosts-edpm-deployment-dmcpx" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.262820 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c482754d-cc4d-4480-b2c3-1ae079c9222b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dmcpx\" (UID: \"c482754d-cc4d-4480-b2c3-1ae079c9222b\") " pod="openstack/ssh-known-hosts-edpm-deployment-dmcpx" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.262943 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzvss\" (UniqueName: \"kubernetes.io/projected/c482754d-cc4d-4480-b2c3-1ae079c9222b-kube-api-access-hzvss\") pod \"ssh-known-hosts-edpm-deployment-dmcpx\" (UID: \"c482754d-cc4d-4480-b2c3-1ae079c9222b\") " pod="openstack/ssh-known-hosts-edpm-deployment-dmcpx" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.364588 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c482754d-cc4d-4480-b2c3-1ae079c9222b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dmcpx\" (UID: \"c482754d-cc4d-4480-b2c3-1ae079c9222b\") " pod="openstack/ssh-known-hosts-edpm-deployment-dmcpx" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.364655 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory-0\" (UniqueName: \"kubernetes.io/secret/c482754d-cc4d-4480-b2c3-1ae079c9222b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dmcpx\" (UID: \"c482754d-cc4d-4480-b2c3-1ae079c9222b\") " pod="openstack/ssh-known-hosts-edpm-deployment-dmcpx" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.364841 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzvss\" (UniqueName: \"kubernetes.io/projected/c482754d-cc4d-4480-b2c3-1ae079c9222b-kube-api-access-hzvss\") pod \"ssh-known-hosts-edpm-deployment-dmcpx\" (UID: \"c482754d-cc4d-4480-b2c3-1ae079c9222b\") " pod="openstack/ssh-known-hosts-edpm-deployment-dmcpx" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.369893 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c482754d-cc4d-4480-b2c3-1ae079c9222b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dmcpx\" (UID: \"c482754d-cc4d-4480-b2c3-1ae079c9222b\") " pod="openstack/ssh-known-hosts-edpm-deployment-dmcpx" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.369939 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c482754d-cc4d-4480-b2c3-1ae079c9222b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dmcpx\" (UID: \"c482754d-cc4d-4480-b2c3-1ae079c9222b\") " pod="openstack/ssh-known-hosts-edpm-deployment-dmcpx" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.381344 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzvss\" (UniqueName: \"kubernetes.io/projected/c482754d-cc4d-4480-b2c3-1ae079c9222b-kube-api-access-hzvss\") pod \"ssh-known-hosts-edpm-deployment-dmcpx\" (UID: \"c482754d-cc4d-4480-b2c3-1ae079c9222b\") " pod="openstack/ssh-known-hosts-edpm-deployment-dmcpx" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.463018 4845 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dmcpx" Oct 06 07:14:31 crc kubenswrapper[4845]: I1006 07:14:31.958244 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dmcpx"] Oct 06 07:14:32 crc kubenswrapper[4845]: I1006 07:14:32.052767 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dmcpx" event={"ID":"c482754d-cc4d-4480-b2c3-1ae079c9222b","Type":"ContainerStarted","Data":"c97cf38931bf1af6d0a0643fd5de63455d11caaff8a4a82ca1659389ac8262bc"} Oct 06 07:14:33 crc kubenswrapper[4845]: I1006 07:14:33.064820 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dmcpx" event={"ID":"c482754d-cc4d-4480-b2c3-1ae079c9222b","Type":"ContainerStarted","Data":"1ee13ddb35f0696cc194e25fc498c5984cab75e07433b5539a5e321a53947a10"} Oct 06 07:14:33 crc kubenswrapper[4845]: I1006 07:14:33.089039 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-dmcpx" podStartSLOduration=1.670435372 podStartE2EDuration="2.089009635s" podCreationTimestamp="2025-10-06 07:14:31 +0000 UTC" firstStartedPulling="2025-10-06 07:14:31.961006415 +0000 UTC m=+1756.475747423" lastFinishedPulling="2025-10-06 07:14:32.379580678 +0000 UTC m=+1756.894321686" observedRunningTime="2025-10-06 07:14:33.082690603 +0000 UTC m=+1757.597431611" watchObservedRunningTime="2025-10-06 07:14:33.089009635 +0000 UTC m=+1757.603750663" Oct 06 07:14:40 crc kubenswrapper[4845]: I1006 07:14:40.118828 4845 generic.go:334] "Generic (PLEG): container finished" podID="c482754d-cc4d-4480-b2c3-1ae079c9222b" containerID="1ee13ddb35f0696cc194e25fc498c5984cab75e07433b5539a5e321a53947a10" exitCode=0 Oct 06 07:14:40 crc kubenswrapper[4845]: I1006 07:14:40.118915 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dmcpx" 
event={"ID":"c482754d-cc4d-4480-b2c3-1ae079c9222b","Type":"ContainerDied","Data":"1ee13ddb35f0696cc194e25fc498c5984cab75e07433b5539a5e321a53947a10"} Oct 06 07:14:41 crc kubenswrapper[4845]: I1006 07:14:41.596638 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dmcpx" Oct 06 07:14:41 crc kubenswrapper[4845]: I1006 07:14:41.651210 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c482754d-cc4d-4480-b2c3-1ae079c9222b-ssh-key-openstack-edpm-ipam\") pod \"c482754d-cc4d-4480-b2c3-1ae079c9222b\" (UID: \"c482754d-cc4d-4480-b2c3-1ae079c9222b\") " Oct 06 07:14:41 crc kubenswrapper[4845]: I1006 07:14:41.651408 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c482754d-cc4d-4480-b2c3-1ae079c9222b-inventory-0\") pod \"c482754d-cc4d-4480-b2c3-1ae079c9222b\" (UID: \"c482754d-cc4d-4480-b2c3-1ae079c9222b\") " Oct 06 07:14:41 crc kubenswrapper[4845]: I1006 07:14:41.651503 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzvss\" (UniqueName: \"kubernetes.io/projected/c482754d-cc4d-4480-b2c3-1ae079c9222b-kube-api-access-hzvss\") pod \"c482754d-cc4d-4480-b2c3-1ae079c9222b\" (UID: \"c482754d-cc4d-4480-b2c3-1ae079c9222b\") " Oct 06 07:14:41 crc kubenswrapper[4845]: I1006 07:14:41.657993 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c482754d-cc4d-4480-b2c3-1ae079c9222b-kube-api-access-hzvss" (OuterVolumeSpecName: "kube-api-access-hzvss") pod "c482754d-cc4d-4480-b2c3-1ae079c9222b" (UID: "c482754d-cc4d-4480-b2c3-1ae079c9222b"). InnerVolumeSpecName "kube-api-access-hzvss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:14:41 crc kubenswrapper[4845]: I1006 07:14:41.678064 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c482754d-cc4d-4480-b2c3-1ae079c9222b-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "c482754d-cc4d-4480-b2c3-1ae079c9222b" (UID: "c482754d-cc4d-4480-b2c3-1ae079c9222b"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:14:41 crc kubenswrapper[4845]: I1006 07:14:41.678476 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c482754d-cc4d-4480-b2c3-1ae079c9222b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c482754d-cc4d-4480-b2c3-1ae079c9222b" (UID: "c482754d-cc4d-4480-b2c3-1ae079c9222b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:14:41 crc kubenswrapper[4845]: I1006 07:14:41.752903 4845 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c482754d-cc4d-4480-b2c3-1ae079c9222b-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 06 07:14:41 crc kubenswrapper[4845]: I1006 07:14:41.752977 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzvss\" (UniqueName: \"kubernetes.io/projected/c482754d-cc4d-4480-b2c3-1ae079c9222b-kube-api-access-hzvss\") on node \"crc\" DevicePath \"\"" Oct 06 07:14:41 crc kubenswrapper[4845]: I1006 07:14:41.752989 4845 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c482754d-cc4d-4480-b2c3-1ae079c9222b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.136047 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dmcpx" 
event={"ID":"c482754d-cc4d-4480-b2c3-1ae079c9222b","Type":"ContainerDied","Data":"c97cf38931bf1af6d0a0643fd5de63455d11caaff8a4a82ca1659389ac8262bc"} Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.136081 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dmcpx" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.136084 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c97cf38931bf1af6d0a0643fd5de63455d11caaff8a4a82ca1659389ac8262bc" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.222198 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n"] Oct 06 07:14:42 crc kubenswrapper[4845]: E1006 07:14:42.222568 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c482754d-cc4d-4480-b2c3-1ae079c9222b" containerName="ssh-known-hosts-edpm-deployment" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.222585 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c482754d-cc4d-4480-b2c3-1ae079c9222b" containerName="ssh-known-hosts-edpm-deployment" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.222820 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c482754d-cc4d-4480-b2c3-1ae079c9222b" containerName="ssh-known-hosts-edpm-deployment" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.223467 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.229792 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p48vv" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.229825 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.229880 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.229973 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.246162 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n"] Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.361918 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88aa0357-70ea-4f4d-80e7-952615d772fe-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bcz6n\" (UID: \"88aa0357-70ea-4f4d-80e7-952615d772fe\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.362663 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88aa0357-70ea-4f4d-80e7-952615d772fe-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bcz6n\" (UID: \"88aa0357-70ea-4f4d-80e7-952615d772fe\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.362810 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x8ck\" (UniqueName: \"kubernetes.io/projected/88aa0357-70ea-4f4d-80e7-952615d772fe-kube-api-access-5x8ck\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bcz6n\" (UID: \"88aa0357-70ea-4f4d-80e7-952615d772fe\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.464960 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88aa0357-70ea-4f4d-80e7-952615d772fe-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bcz6n\" (UID: \"88aa0357-70ea-4f4d-80e7-952615d772fe\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.465084 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88aa0357-70ea-4f4d-80e7-952615d772fe-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bcz6n\" (UID: \"88aa0357-70ea-4f4d-80e7-952615d772fe\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.465163 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x8ck\" (UniqueName: \"kubernetes.io/projected/88aa0357-70ea-4f4d-80e7-952615d772fe-kube-api-access-5x8ck\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bcz6n\" (UID: \"88aa0357-70ea-4f4d-80e7-952615d772fe\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.468503 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88aa0357-70ea-4f4d-80e7-952615d772fe-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bcz6n\" (UID: \"88aa0357-70ea-4f4d-80e7-952615d772fe\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.470570 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88aa0357-70ea-4f4d-80e7-952615d772fe-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bcz6n\" (UID: \"88aa0357-70ea-4f4d-80e7-952615d772fe\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.481063 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x8ck\" (UniqueName: \"kubernetes.io/projected/88aa0357-70ea-4f4d-80e7-952615d772fe-kube-api-access-5x8ck\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bcz6n\" (UID: \"88aa0357-70ea-4f4d-80e7-952615d772fe\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n" Oct 06 07:14:42 crc kubenswrapper[4845]: I1006 07:14:42.541154 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n" Oct 06 07:14:43 crc kubenswrapper[4845]: I1006 07:14:43.047740 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n"] Oct 06 07:14:43 crc kubenswrapper[4845]: I1006 07:14:43.146157 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n" event={"ID":"88aa0357-70ea-4f4d-80e7-952615d772fe","Type":"ContainerStarted","Data":"5ad38ca9b60fc7d86577a65a0f5e4938f919765930ca50172f7c7826523b1688"} Oct 06 07:14:44 crc kubenswrapper[4845]: I1006 07:14:44.164278 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n" event={"ID":"88aa0357-70ea-4f4d-80e7-952615d772fe","Type":"ContainerStarted","Data":"2867d19c8c021d2b6d74816e422b0f492e2284c18edeb85a4f2de4d904cf8f7f"} Oct 06 07:14:44 crc kubenswrapper[4845]: I1006 07:14:44.183927 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n" podStartSLOduration=1.597747152 podStartE2EDuration="2.183905375s" podCreationTimestamp="2025-10-06 07:14:42 +0000 UTC" firstStartedPulling="2025-10-06 07:14:43.053690539 +0000 UTC m=+1767.568431547" lastFinishedPulling="2025-10-06 07:14:43.639848762 +0000 UTC m=+1768.154589770" observedRunningTime="2025-10-06 07:14:44.177866 +0000 UTC m=+1768.692607008" watchObservedRunningTime="2025-10-06 07:14:44.183905375 +0000 UTC m=+1768.698646383" Oct 06 07:14:46 crc kubenswrapper[4845]: I1006 07:14:46.232034 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:14:46 crc kubenswrapper[4845]: E1006 07:14:46.232669 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:14:51 crc kubenswrapper[4845]: I1006 07:14:51.226877 4845 generic.go:334] "Generic (PLEG): container finished" podID="88aa0357-70ea-4f4d-80e7-952615d772fe" containerID="2867d19c8c021d2b6d74816e422b0f492e2284c18edeb85a4f2de4d904cf8f7f" exitCode=0 Oct 06 07:14:51 crc kubenswrapper[4845]: I1006 07:14:51.226966 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n" event={"ID":"88aa0357-70ea-4f4d-80e7-952615d772fe","Type":"ContainerDied","Data":"2867d19c8c021d2b6d74816e422b0f492e2284c18edeb85a4f2de4d904cf8f7f"} Oct 06 07:14:52 crc kubenswrapper[4845]: I1006 07:14:52.641479 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n" Oct 06 07:14:52 crc kubenswrapper[4845]: I1006 07:14:52.769915 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88aa0357-70ea-4f4d-80e7-952615d772fe-inventory\") pod \"88aa0357-70ea-4f4d-80e7-952615d772fe\" (UID: \"88aa0357-70ea-4f4d-80e7-952615d772fe\") " Oct 06 07:14:52 crc kubenswrapper[4845]: I1006 07:14:52.769987 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x8ck\" (UniqueName: \"kubernetes.io/projected/88aa0357-70ea-4f4d-80e7-952615d772fe-kube-api-access-5x8ck\") pod \"88aa0357-70ea-4f4d-80e7-952615d772fe\" (UID: \"88aa0357-70ea-4f4d-80e7-952615d772fe\") " Oct 06 07:14:52 crc kubenswrapper[4845]: I1006 07:14:52.770094 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/88aa0357-70ea-4f4d-80e7-952615d772fe-ssh-key\") pod \"88aa0357-70ea-4f4d-80e7-952615d772fe\" (UID: \"88aa0357-70ea-4f4d-80e7-952615d772fe\") " Oct 06 07:14:52 crc kubenswrapper[4845]: I1006 07:14:52.775449 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88aa0357-70ea-4f4d-80e7-952615d772fe-kube-api-access-5x8ck" (OuterVolumeSpecName: "kube-api-access-5x8ck") pod "88aa0357-70ea-4f4d-80e7-952615d772fe" (UID: "88aa0357-70ea-4f4d-80e7-952615d772fe"). InnerVolumeSpecName "kube-api-access-5x8ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:14:52 crc kubenswrapper[4845]: I1006 07:14:52.796648 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88aa0357-70ea-4f4d-80e7-952615d772fe-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "88aa0357-70ea-4f4d-80e7-952615d772fe" (UID: "88aa0357-70ea-4f4d-80e7-952615d772fe"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:14:52 crc kubenswrapper[4845]: I1006 07:14:52.800849 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88aa0357-70ea-4f4d-80e7-952615d772fe-inventory" (OuterVolumeSpecName: "inventory") pod "88aa0357-70ea-4f4d-80e7-952615d772fe" (UID: "88aa0357-70ea-4f4d-80e7-952615d772fe"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:14:52 crc kubenswrapper[4845]: I1006 07:14:52.872396 4845 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88aa0357-70ea-4f4d-80e7-952615d772fe-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 07:14:52 crc kubenswrapper[4845]: I1006 07:14:52.872431 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x8ck\" (UniqueName: \"kubernetes.io/projected/88aa0357-70ea-4f4d-80e7-952615d772fe-kube-api-access-5x8ck\") on node \"crc\" DevicePath \"\"" Oct 06 07:14:52 crc kubenswrapper[4845]: I1006 07:14:52.872442 4845 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88aa0357-70ea-4f4d-80e7-952615d772fe-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.243650 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n" event={"ID":"88aa0357-70ea-4f4d-80e7-952615d772fe","Type":"ContainerDied","Data":"5ad38ca9b60fc7d86577a65a0f5e4938f919765930ca50172f7c7826523b1688"} Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.243688 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ad38ca9b60fc7d86577a65a0f5e4938f919765930ca50172f7c7826523b1688" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.243733 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bcz6n" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.315243 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw"] Oct 06 07:14:53 crc kubenswrapper[4845]: E1006 07:14:53.315655 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88aa0357-70ea-4f4d-80e7-952615d772fe" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.315704 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="88aa0357-70ea-4f4d-80e7-952615d772fe" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.315961 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="88aa0357-70ea-4f4d-80e7-952615d772fe" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.316800 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.322062 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.322062 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.324753 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.326921 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p48vv" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.326944 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw"] Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.482068 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc1542af-e2ed-4aed-b0e9-0854b00c1320-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw\" (UID: \"bc1542af-e2ed-4aed-b0e9-0854b00c1320\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.482196 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc1542af-e2ed-4aed-b0e9-0854b00c1320-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw\" (UID: \"bc1542af-e2ed-4aed-b0e9-0854b00c1320\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.482241 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6kdr\" (UniqueName: \"kubernetes.io/projected/bc1542af-e2ed-4aed-b0e9-0854b00c1320-kube-api-access-b6kdr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw\" (UID: \"bc1542af-e2ed-4aed-b0e9-0854b00c1320\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.583302 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc1542af-e2ed-4aed-b0e9-0854b00c1320-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw\" (UID: \"bc1542af-e2ed-4aed-b0e9-0854b00c1320\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.583737 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc1542af-e2ed-4aed-b0e9-0854b00c1320-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw\" (UID: \"bc1542af-e2ed-4aed-b0e9-0854b00c1320\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.583768 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6kdr\" (UniqueName: \"kubernetes.io/projected/bc1542af-e2ed-4aed-b0e9-0854b00c1320-kube-api-access-b6kdr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw\" (UID: \"bc1542af-e2ed-4aed-b0e9-0854b00c1320\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.588540 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc1542af-e2ed-4aed-b0e9-0854b00c1320-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw\" (UID: 
\"bc1542af-e2ed-4aed-b0e9-0854b00c1320\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.591214 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc1542af-e2ed-4aed-b0e9-0854b00c1320-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw\" (UID: \"bc1542af-e2ed-4aed-b0e9-0854b00c1320\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.609667 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6kdr\" (UniqueName: \"kubernetes.io/projected/bc1542af-e2ed-4aed-b0e9-0854b00c1320-kube-api-access-b6kdr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw\" (UID: \"bc1542af-e2ed-4aed-b0e9-0854b00c1320\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw" Oct 06 07:14:53 crc kubenswrapper[4845]: I1006 07:14:53.646843 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw" Oct 06 07:14:54 crc kubenswrapper[4845]: I1006 07:14:54.180865 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw"] Oct 06 07:14:54 crc kubenswrapper[4845]: W1006 07:14:54.192874 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc1542af_e2ed_4aed_b0e9_0854b00c1320.slice/crio-7eb0f6d50af46b41f14ebd4b7aa7c184b1c5276b47acdc60b7039631c9a930e0 WatchSource:0}: Error finding container 7eb0f6d50af46b41f14ebd4b7aa7c184b1c5276b47acdc60b7039631c9a930e0: Status 404 returned error can't find the container with id 7eb0f6d50af46b41f14ebd4b7aa7c184b1c5276b47acdc60b7039631c9a930e0 Oct 06 07:14:54 crc kubenswrapper[4845]: I1006 07:14:54.252804 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw" event={"ID":"bc1542af-e2ed-4aed-b0e9-0854b00c1320","Type":"ContainerStarted","Data":"7eb0f6d50af46b41f14ebd4b7aa7c184b1c5276b47acdc60b7039631c9a930e0"} Oct 06 07:14:55 crc kubenswrapper[4845]: I1006 07:14:55.263674 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw" event={"ID":"bc1542af-e2ed-4aed-b0e9-0854b00c1320","Type":"ContainerStarted","Data":"5b80e4d5386f129f471efd7ebd71b5816e41a88f570e18e30941f64892945a04"} Oct 06 07:14:55 crc kubenswrapper[4845]: I1006 07:14:55.281706 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw" podStartSLOduration=1.729723292 podStartE2EDuration="2.281689969s" podCreationTimestamp="2025-10-06 07:14:53 +0000 UTC" firstStartedPulling="2025-10-06 07:14:54.204206355 +0000 UTC m=+1778.718947363" lastFinishedPulling="2025-10-06 07:14:54.756173012 +0000 UTC m=+1779.270914040" 
observedRunningTime="2025-10-06 07:14:55.280333335 +0000 UTC m=+1779.795074353" watchObservedRunningTime="2025-10-06 07:14:55.281689969 +0000 UTC m=+1779.796430987" Oct 06 07:14:57 crc kubenswrapper[4845]: I1006 07:14:57.227400 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:14:57 crc kubenswrapper[4845]: E1006 07:14:57.242010 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:15:00 crc kubenswrapper[4845]: I1006 07:15:00.137500 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328915-n8mqw"] Oct 06 07:15:00 crc kubenswrapper[4845]: I1006 07:15:00.139239 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328915-n8mqw" Oct 06 07:15:00 crc kubenswrapper[4845]: I1006 07:15:00.141347 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 07:15:00 crc kubenswrapper[4845]: I1006 07:15:00.143072 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 07:15:00 crc kubenswrapper[4845]: I1006 07:15:00.166005 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328915-n8mqw"] Oct 06 07:15:00 crc kubenswrapper[4845]: I1006 07:15:00.204839 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwlwx\" (UniqueName: \"kubernetes.io/projected/67ce5a29-47c2-4d60-bdb7-9177ec17345f-kube-api-access-zwlwx\") pod \"collect-profiles-29328915-n8mqw\" (UID: \"67ce5a29-47c2-4d60-bdb7-9177ec17345f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328915-n8mqw" Oct 06 07:15:00 crc kubenswrapper[4845]: I1006 07:15:00.204902 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67ce5a29-47c2-4d60-bdb7-9177ec17345f-secret-volume\") pod \"collect-profiles-29328915-n8mqw\" (UID: \"67ce5a29-47c2-4d60-bdb7-9177ec17345f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328915-n8mqw" Oct 06 07:15:00 crc kubenswrapper[4845]: I1006 07:15:00.204995 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67ce5a29-47c2-4d60-bdb7-9177ec17345f-config-volume\") pod \"collect-profiles-29328915-n8mqw\" (UID: \"67ce5a29-47c2-4d60-bdb7-9177ec17345f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29328915-n8mqw" Oct 06 07:15:00 crc kubenswrapper[4845]: I1006 07:15:00.306146 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwlwx\" (UniqueName: \"kubernetes.io/projected/67ce5a29-47c2-4d60-bdb7-9177ec17345f-kube-api-access-zwlwx\") pod \"collect-profiles-29328915-n8mqw\" (UID: \"67ce5a29-47c2-4d60-bdb7-9177ec17345f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328915-n8mqw" Oct 06 07:15:00 crc kubenswrapper[4845]: I1006 07:15:00.306198 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67ce5a29-47c2-4d60-bdb7-9177ec17345f-secret-volume\") pod \"collect-profiles-29328915-n8mqw\" (UID: \"67ce5a29-47c2-4d60-bdb7-9177ec17345f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328915-n8mqw" Oct 06 07:15:00 crc kubenswrapper[4845]: I1006 07:15:00.306281 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67ce5a29-47c2-4d60-bdb7-9177ec17345f-config-volume\") pod \"collect-profiles-29328915-n8mqw\" (UID: \"67ce5a29-47c2-4d60-bdb7-9177ec17345f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328915-n8mqw" Oct 06 07:15:00 crc kubenswrapper[4845]: I1006 07:15:00.307482 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67ce5a29-47c2-4d60-bdb7-9177ec17345f-config-volume\") pod \"collect-profiles-29328915-n8mqw\" (UID: \"67ce5a29-47c2-4d60-bdb7-9177ec17345f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328915-n8mqw" Oct 06 07:15:00 crc kubenswrapper[4845]: I1006 07:15:00.312681 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/67ce5a29-47c2-4d60-bdb7-9177ec17345f-secret-volume\") pod \"collect-profiles-29328915-n8mqw\" (UID: \"67ce5a29-47c2-4d60-bdb7-9177ec17345f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328915-n8mqw" Oct 06 07:15:00 crc kubenswrapper[4845]: I1006 07:15:00.330418 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwlwx\" (UniqueName: \"kubernetes.io/projected/67ce5a29-47c2-4d60-bdb7-9177ec17345f-kube-api-access-zwlwx\") pod \"collect-profiles-29328915-n8mqw\" (UID: \"67ce5a29-47c2-4d60-bdb7-9177ec17345f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328915-n8mqw" Oct 06 07:15:00 crc kubenswrapper[4845]: I1006 07:15:00.485883 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328915-n8mqw" Oct 06 07:15:00 crc kubenswrapper[4845]: I1006 07:15:00.920735 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328915-n8mqw"] Oct 06 07:15:00 crc kubenswrapper[4845]: W1006 07:15:00.923911 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67ce5a29_47c2_4d60_bdb7_9177ec17345f.slice/crio-7d6858fe7c264c091db64fca13b090447632a2ac73fee427897967b011cd2076 WatchSource:0}: Error finding container 7d6858fe7c264c091db64fca13b090447632a2ac73fee427897967b011cd2076: Status 404 returned error can't find the container with id 7d6858fe7c264c091db64fca13b090447632a2ac73fee427897967b011cd2076 Oct 06 07:15:01 crc kubenswrapper[4845]: I1006 07:15:01.315461 4845 generic.go:334] "Generic (PLEG): container finished" podID="67ce5a29-47c2-4d60-bdb7-9177ec17345f" containerID="09c4d6e7c2afafc78f607beade71495528b30eab465f96ac4e84d1815ee41b47" exitCode=0 Oct 06 07:15:01 crc kubenswrapper[4845]: I1006 07:15:01.315540 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29328915-n8mqw" event={"ID":"67ce5a29-47c2-4d60-bdb7-9177ec17345f","Type":"ContainerDied","Data":"09c4d6e7c2afafc78f607beade71495528b30eab465f96ac4e84d1815ee41b47"} Oct 06 07:15:01 crc kubenswrapper[4845]: I1006 07:15:01.315783 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328915-n8mqw" event={"ID":"67ce5a29-47c2-4d60-bdb7-9177ec17345f","Type":"ContainerStarted","Data":"7d6858fe7c264c091db64fca13b090447632a2ac73fee427897967b011cd2076"} Oct 06 07:15:02 crc kubenswrapper[4845]: I1006 07:15:02.608711 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328915-n8mqw" Oct 06 07:15:02 crc kubenswrapper[4845]: I1006 07:15:02.666822 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67ce5a29-47c2-4d60-bdb7-9177ec17345f-secret-volume\") pod \"67ce5a29-47c2-4d60-bdb7-9177ec17345f\" (UID: \"67ce5a29-47c2-4d60-bdb7-9177ec17345f\") " Oct 06 07:15:02 crc kubenswrapper[4845]: I1006 07:15:02.666861 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67ce5a29-47c2-4d60-bdb7-9177ec17345f-config-volume\") pod \"67ce5a29-47c2-4d60-bdb7-9177ec17345f\" (UID: \"67ce5a29-47c2-4d60-bdb7-9177ec17345f\") " Oct 06 07:15:02 crc kubenswrapper[4845]: I1006 07:15:02.666923 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwlwx\" (UniqueName: \"kubernetes.io/projected/67ce5a29-47c2-4d60-bdb7-9177ec17345f-kube-api-access-zwlwx\") pod \"67ce5a29-47c2-4d60-bdb7-9177ec17345f\" (UID: \"67ce5a29-47c2-4d60-bdb7-9177ec17345f\") " Oct 06 07:15:02 crc kubenswrapper[4845]: I1006 07:15:02.667511 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/67ce5a29-47c2-4d60-bdb7-9177ec17345f-config-volume" (OuterVolumeSpecName: "config-volume") pod "67ce5a29-47c2-4d60-bdb7-9177ec17345f" (UID: "67ce5a29-47c2-4d60-bdb7-9177ec17345f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:15:02 crc kubenswrapper[4845]: I1006 07:15:02.673511 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67ce5a29-47c2-4d60-bdb7-9177ec17345f-kube-api-access-zwlwx" (OuterVolumeSpecName: "kube-api-access-zwlwx") pod "67ce5a29-47c2-4d60-bdb7-9177ec17345f" (UID: "67ce5a29-47c2-4d60-bdb7-9177ec17345f"). InnerVolumeSpecName "kube-api-access-zwlwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:15:02 crc kubenswrapper[4845]: I1006 07:15:02.673554 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67ce5a29-47c2-4d60-bdb7-9177ec17345f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "67ce5a29-47c2-4d60-bdb7-9177ec17345f" (UID: "67ce5a29-47c2-4d60-bdb7-9177ec17345f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:15:02 crc kubenswrapper[4845]: I1006 07:15:02.768850 4845 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67ce5a29-47c2-4d60-bdb7-9177ec17345f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:02 crc kubenswrapper[4845]: I1006 07:15:02.768883 4845 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67ce5a29-47c2-4d60-bdb7-9177ec17345f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:02 crc kubenswrapper[4845]: I1006 07:15:02.768894 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwlwx\" (UniqueName: \"kubernetes.io/projected/67ce5a29-47c2-4d60-bdb7-9177ec17345f-kube-api-access-zwlwx\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:03 crc kubenswrapper[4845]: I1006 07:15:03.333736 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328915-n8mqw" event={"ID":"67ce5a29-47c2-4d60-bdb7-9177ec17345f","Type":"ContainerDied","Data":"7d6858fe7c264c091db64fca13b090447632a2ac73fee427897967b011cd2076"} Oct 06 07:15:03 crc kubenswrapper[4845]: I1006 07:15:03.334269 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d6858fe7c264c091db64fca13b090447632a2ac73fee427897967b011cd2076" Oct 06 07:15:03 crc kubenswrapper[4845]: I1006 07:15:03.333794 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328915-n8mqw" Oct 06 07:15:04 crc kubenswrapper[4845]: I1006 07:15:04.349630 4845 generic.go:334] "Generic (PLEG): container finished" podID="bc1542af-e2ed-4aed-b0e9-0854b00c1320" containerID="5b80e4d5386f129f471efd7ebd71b5816e41a88f570e18e30941f64892945a04" exitCode=0 Oct 06 07:15:04 crc kubenswrapper[4845]: I1006 07:15:04.349718 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw" event={"ID":"bc1542af-e2ed-4aed-b0e9-0854b00c1320","Type":"ContainerDied","Data":"5b80e4d5386f129f471efd7ebd71b5816e41a88f570e18e30941f64892945a04"} Oct 06 07:15:05 crc kubenswrapper[4845]: I1006 07:15:05.721202 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw" Oct 06 07:15:05 crc kubenswrapper[4845]: I1006 07:15:05.843577 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6kdr\" (UniqueName: \"kubernetes.io/projected/bc1542af-e2ed-4aed-b0e9-0854b00c1320-kube-api-access-b6kdr\") pod \"bc1542af-e2ed-4aed-b0e9-0854b00c1320\" (UID: \"bc1542af-e2ed-4aed-b0e9-0854b00c1320\") " Oct 06 07:15:05 crc kubenswrapper[4845]: I1006 07:15:05.844006 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc1542af-e2ed-4aed-b0e9-0854b00c1320-ssh-key\") pod \"bc1542af-e2ed-4aed-b0e9-0854b00c1320\" (UID: \"bc1542af-e2ed-4aed-b0e9-0854b00c1320\") " Oct 06 07:15:05 crc kubenswrapper[4845]: I1006 07:15:05.844080 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc1542af-e2ed-4aed-b0e9-0854b00c1320-inventory\") pod \"bc1542af-e2ed-4aed-b0e9-0854b00c1320\" (UID: \"bc1542af-e2ed-4aed-b0e9-0854b00c1320\") " Oct 06 07:15:05 crc kubenswrapper[4845]: I1006 
07:15:05.848686 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc1542af-e2ed-4aed-b0e9-0854b00c1320-kube-api-access-b6kdr" (OuterVolumeSpecName: "kube-api-access-b6kdr") pod "bc1542af-e2ed-4aed-b0e9-0854b00c1320" (UID: "bc1542af-e2ed-4aed-b0e9-0854b00c1320"). InnerVolumeSpecName "kube-api-access-b6kdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:15:05 crc kubenswrapper[4845]: I1006 07:15:05.870760 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc1542af-e2ed-4aed-b0e9-0854b00c1320-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bc1542af-e2ed-4aed-b0e9-0854b00c1320" (UID: "bc1542af-e2ed-4aed-b0e9-0854b00c1320"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:15:05 crc kubenswrapper[4845]: I1006 07:15:05.871520 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc1542af-e2ed-4aed-b0e9-0854b00c1320-inventory" (OuterVolumeSpecName: "inventory") pod "bc1542af-e2ed-4aed-b0e9-0854b00c1320" (UID: "bc1542af-e2ed-4aed-b0e9-0854b00c1320"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:15:05 crc kubenswrapper[4845]: I1006 07:15:05.945780 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6kdr\" (UniqueName: \"kubernetes.io/projected/bc1542af-e2ed-4aed-b0e9-0854b00c1320-kube-api-access-b6kdr\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:05 crc kubenswrapper[4845]: I1006 07:15:05.945818 4845 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc1542af-e2ed-4aed-b0e9-0854b00c1320-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:05 crc kubenswrapper[4845]: I1006 07:15:05.945829 4845 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc1542af-e2ed-4aed-b0e9-0854b00c1320-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.367655 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw" event={"ID":"bc1542af-e2ed-4aed-b0e9-0854b00c1320","Type":"ContainerDied","Data":"7eb0f6d50af46b41f14ebd4b7aa7c184b1c5276b47acdc60b7039631c9a930e0"} Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.367700 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7eb0f6d50af46b41f14ebd4b7aa7c184b1c5276b47acdc60b7039631c9a930e0" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.367724 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.465367 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p"] Oct 06 07:15:06 crc kubenswrapper[4845]: E1006 07:15:06.465736 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ce5a29-47c2-4d60-bdb7-9177ec17345f" containerName="collect-profiles" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.465752 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ce5a29-47c2-4d60-bdb7-9177ec17345f" containerName="collect-profiles" Oct 06 07:15:06 crc kubenswrapper[4845]: E1006 07:15:06.465761 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc1542af-e2ed-4aed-b0e9-0854b00c1320" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.465768 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc1542af-e2ed-4aed-b0e9-0854b00c1320" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.466005 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="67ce5a29-47c2-4d60-bdb7-9177ec17345f" containerName="collect-profiles" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.466083 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc1542af-e2ed-4aed-b0e9-0854b00c1320" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.466688 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.468733 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.471203 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.471488 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.471689 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.471875 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.472134 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.472135 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.472304 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p48vv" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.479045 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p"] Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.657767 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.657833 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.657864 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.657906 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c47cg\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-kube-api-access-c47cg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.657945 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.657972 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.658668 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.658746 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.658796 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.658834 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.658879 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.658947 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.658992 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.659047 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.760565 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.760634 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.760665 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: 
\"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.760700 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c47cg\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-kube-api-access-c47cg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.760745 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.760773 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.760796 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: 
\"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.760847 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.760887 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.760914 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.760939 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.760990 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.761013 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.761041 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.766277 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.766592 4845 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.768733 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.768867 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.768864 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.768969 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.769043 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.769235 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.770026 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.770330 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: 
\"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.777794 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.790109 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.790474 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:06 crc kubenswrapper[4845]: I1006 07:15:06.792957 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c47cg\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-kube-api-access-c47cg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gm46p\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:07 crc kubenswrapper[4845]: I1006 07:15:07.085542 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:07 crc kubenswrapper[4845]: I1006 07:15:07.636766 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p"] Oct 06 07:15:08 crc kubenswrapper[4845]: I1006 07:15:08.385920 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" event={"ID":"705e6ad6-f299-43dc-8d30-7c0bd5039250","Type":"ContainerStarted","Data":"cb0f60548357b4ecdc27efcc031e036446fd2e34970948508b7cda854944f0a6"} Oct 06 07:15:09 crc kubenswrapper[4845]: I1006 07:15:09.395054 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" event={"ID":"705e6ad6-f299-43dc-8d30-7c0bd5039250","Type":"ContainerStarted","Data":"92d94c9a1aaa535387281128d801fc2ba203f518d220f7afb286fc02109121f3"} Oct 06 07:15:09 crc kubenswrapper[4845]: I1006 07:15:09.417710 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" podStartSLOduration=2.819376709 podStartE2EDuration="3.417691785s" podCreationTimestamp="2025-10-06 07:15:06 +0000 UTC" firstStartedPulling="2025-10-06 07:15:07.642816903 +0000 UTC m=+1792.157557921" lastFinishedPulling="2025-10-06 07:15:08.241131989 +0000 UTC m=+1792.755872997" observedRunningTime="2025-10-06 07:15:09.412947243 +0000 UTC m=+1793.927688271" watchObservedRunningTime="2025-10-06 07:15:09.417691785 +0000 UTC m=+1793.932432813" Oct 06 07:15:12 crc kubenswrapper[4845]: I1006 07:15:12.227775 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 
07:15:12 crc kubenswrapper[4845]: E1006 07:15:12.228060 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:15:25 crc kubenswrapper[4845]: I1006 07:15:25.274343 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9" Oct 06 07:15:25 crc kubenswrapper[4845]: I1006 07:15:25.527772 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerStarted","Data":"2c740b11eacd8eeb60421cfadeeedcffc611c0df3872f825986c7df388e5adde"} Oct 06 07:15:46 crc kubenswrapper[4845]: I1006 07:15:46.784516 4845 generic.go:334] "Generic (PLEG): container finished" podID="705e6ad6-f299-43dc-8d30-7c0bd5039250" containerID="92d94c9a1aaa535387281128d801fc2ba203f518d220f7afb286fc02109121f3" exitCode=0 Oct 06 07:15:46 crc kubenswrapper[4845]: I1006 07:15:46.784556 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" event={"ID":"705e6ad6-f299-43dc-8d30-7c0bd5039250","Type":"ContainerDied","Data":"92d94c9a1aaa535387281128d801fc2ba203f518d220f7afb286fc02109121f3"} Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.259316 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.403672 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"705e6ad6-f299-43dc-8d30-7c0bd5039250\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.404000 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-ovn-combined-ca-bundle\") pod \"705e6ad6-f299-43dc-8d30-7c0bd5039250\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.404113 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"705e6ad6-f299-43dc-8d30-7c0bd5039250\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.404217 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-ovn-default-certs-0\") pod \"705e6ad6-f299-43dc-8d30-7c0bd5039250\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.404301 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"705e6ad6-f299-43dc-8d30-7c0bd5039250\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.404409 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c47cg\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-kube-api-access-c47cg\") pod \"705e6ad6-f299-43dc-8d30-7c0bd5039250\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.404521 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-bootstrap-combined-ca-bundle\") pod \"705e6ad6-f299-43dc-8d30-7c0bd5039250\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.404603 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-repo-setup-combined-ca-bundle\") pod \"705e6ad6-f299-43dc-8d30-7c0bd5039250\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.404705 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-nova-combined-ca-bundle\") pod \"705e6ad6-f299-43dc-8d30-7c0bd5039250\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.404820 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-inventory\") pod \"705e6ad6-f299-43dc-8d30-7c0bd5039250\" 
(UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.404936 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-ssh-key\") pod \"705e6ad6-f299-43dc-8d30-7c0bd5039250\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.405010 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-neutron-metadata-combined-ca-bundle\") pod \"705e6ad6-f299-43dc-8d30-7c0bd5039250\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.405092 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-libvirt-combined-ca-bundle\") pod \"705e6ad6-f299-43dc-8d30-7c0bd5039250\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.405193 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-telemetry-combined-ca-bundle\") pod \"705e6ad6-f299-43dc-8d30-7c0bd5039250\" (UID: \"705e6ad6-f299-43dc-8d30-7c0bd5039250\") " Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.413110 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "705e6ad6-f299-43dc-8d30-7c0bd5039250" (UID: "705e6ad6-f299-43dc-8d30-7c0bd5039250"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.413190 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "705e6ad6-f299-43dc-8d30-7c0bd5039250" (UID: "705e6ad6-f299-43dc-8d30-7c0bd5039250"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.414312 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "705e6ad6-f299-43dc-8d30-7c0bd5039250" (UID: "705e6ad6-f299-43dc-8d30-7c0bd5039250"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.414397 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-kube-api-access-c47cg" (OuterVolumeSpecName: "kube-api-access-c47cg") pod "705e6ad6-f299-43dc-8d30-7c0bd5039250" (UID: "705e6ad6-f299-43dc-8d30-7c0bd5039250"). InnerVolumeSpecName "kube-api-access-c47cg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.415398 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "705e6ad6-f299-43dc-8d30-7c0bd5039250" (UID: "705e6ad6-f299-43dc-8d30-7c0bd5039250"). 
InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.415506 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "705e6ad6-f299-43dc-8d30-7c0bd5039250" (UID: "705e6ad6-f299-43dc-8d30-7c0bd5039250"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.415525 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "705e6ad6-f299-43dc-8d30-7c0bd5039250" (UID: "705e6ad6-f299-43dc-8d30-7c0bd5039250"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.415987 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "705e6ad6-f299-43dc-8d30-7c0bd5039250" (UID: "705e6ad6-f299-43dc-8d30-7c0bd5039250"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.416649 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "705e6ad6-f299-43dc-8d30-7c0bd5039250" (UID: "705e6ad6-f299-43dc-8d30-7c0bd5039250"). 
InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.417081 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "705e6ad6-f299-43dc-8d30-7c0bd5039250" (UID: "705e6ad6-f299-43dc-8d30-7c0bd5039250"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.417577 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "705e6ad6-f299-43dc-8d30-7c0bd5039250" (UID: "705e6ad6-f299-43dc-8d30-7c0bd5039250"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.426543 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "705e6ad6-f299-43dc-8d30-7c0bd5039250" (UID: "705e6ad6-f299-43dc-8d30-7c0bd5039250"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.438118 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-inventory" (OuterVolumeSpecName: "inventory") pod "705e6ad6-f299-43dc-8d30-7c0bd5039250" (UID: "705e6ad6-f299-43dc-8d30-7c0bd5039250"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.441068 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "705e6ad6-f299-43dc-8d30-7c0bd5039250" (UID: "705e6ad6-f299-43dc-8d30-7c0bd5039250"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.507458 4845 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.507500 4845 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.507515 4845 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.507531 4845 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.507542 4845 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.507556 4845 reconciler_common.go:293] "Volume detached for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.507574 4845 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.507586 4845 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.507600 4845 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.507615 4845 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705e6ad6-f299-43dc-8d30-7c0bd5039250-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.507628 4845 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.507640 4845 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.507653 4845 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.507666 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c47cg\" (UniqueName: \"kubernetes.io/projected/705e6ad6-f299-43dc-8d30-7c0bd5039250-kube-api-access-c47cg\") on node \"crc\" DevicePath \"\"" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.806875 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" event={"ID":"705e6ad6-f299-43dc-8d30-7c0bd5039250","Type":"ContainerDied","Data":"cb0f60548357b4ecdc27efcc031e036446fd2e34970948508b7cda854944f0a6"} Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.807207 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb0f60548357b4ecdc27efcc031e036446fd2e34970948508b7cda854944f0a6" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.807156 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gm46p" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.911563 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd"] Oct 06 07:15:48 crc kubenswrapper[4845]: E1006 07:15:48.912189 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="705e6ad6-f299-43dc-8d30-7c0bd5039250" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.912268 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="705e6ad6-f299-43dc-8d30-7c0bd5039250" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.912594 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="705e6ad6-f299-43dc-8d30-7c0bd5039250" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.913290 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.915856 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.916162 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.917051 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p48vv" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.917715 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.921221 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 07:15:48 crc kubenswrapper[4845]: I1006 07:15:48.924777 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd"] Oct 06 07:15:49 crc kubenswrapper[4845]: I1006 07:15:49.016179 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1428fc5-6ae1-4387-9635-69c26981be2a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fqfdd\" (UID: \"c1428fc5-6ae1-4387-9635-69c26981be2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" Oct 06 07:15:49 crc kubenswrapper[4845]: I1006 07:15:49.016260 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c1428fc5-6ae1-4387-9635-69c26981be2a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fqfdd\" (UID: 
\"c1428fc5-6ae1-4387-9635-69c26981be2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" Oct 06 07:15:49 crc kubenswrapper[4845]: I1006 07:15:49.016285 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1428fc5-6ae1-4387-9635-69c26981be2a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fqfdd\" (UID: \"c1428fc5-6ae1-4387-9635-69c26981be2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" Oct 06 07:15:49 crc kubenswrapper[4845]: I1006 07:15:49.016469 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1428fc5-6ae1-4387-9635-69c26981be2a-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fqfdd\" (UID: \"c1428fc5-6ae1-4387-9635-69c26981be2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" Oct 06 07:15:49 crc kubenswrapper[4845]: I1006 07:15:49.016627 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjxd5\" (UniqueName: \"kubernetes.io/projected/c1428fc5-6ae1-4387-9635-69c26981be2a-kube-api-access-kjxd5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fqfdd\" (UID: \"c1428fc5-6ae1-4387-9635-69c26981be2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" Oct 06 07:15:49 crc kubenswrapper[4845]: I1006 07:15:49.118364 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjxd5\" (UniqueName: \"kubernetes.io/projected/c1428fc5-6ae1-4387-9635-69c26981be2a-kube-api-access-kjxd5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fqfdd\" (UID: \"c1428fc5-6ae1-4387-9635-69c26981be2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" Oct 06 07:15:49 crc kubenswrapper[4845]: I1006 07:15:49.118483 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1428fc5-6ae1-4387-9635-69c26981be2a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fqfdd\" (UID: \"c1428fc5-6ae1-4387-9635-69c26981be2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" Oct 06 07:15:49 crc kubenswrapper[4845]: I1006 07:15:49.118537 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c1428fc5-6ae1-4387-9635-69c26981be2a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fqfdd\" (UID: \"c1428fc5-6ae1-4387-9635-69c26981be2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" Oct 06 07:15:49 crc kubenswrapper[4845]: I1006 07:15:49.118561 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1428fc5-6ae1-4387-9635-69c26981be2a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fqfdd\" (UID: \"c1428fc5-6ae1-4387-9635-69c26981be2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" Oct 06 07:15:49 crc kubenswrapper[4845]: I1006 07:15:49.118628 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1428fc5-6ae1-4387-9635-69c26981be2a-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fqfdd\" (UID: \"c1428fc5-6ae1-4387-9635-69c26981be2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" Oct 06 07:15:49 crc kubenswrapper[4845]: I1006 07:15:49.120423 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c1428fc5-6ae1-4387-9635-69c26981be2a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fqfdd\" (UID: \"c1428fc5-6ae1-4387-9635-69c26981be2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" Oct 06 07:15:49 crc 
kubenswrapper[4845]: I1006 07:15:49.122818 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1428fc5-6ae1-4387-9635-69c26981be2a-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fqfdd\" (UID: \"c1428fc5-6ae1-4387-9635-69c26981be2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" Oct 06 07:15:49 crc kubenswrapper[4845]: I1006 07:15:49.123297 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1428fc5-6ae1-4387-9635-69c26981be2a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fqfdd\" (UID: \"c1428fc5-6ae1-4387-9635-69c26981be2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" Oct 06 07:15:49 crc kubenswrapper[4845]: I1006 07:15:49.132099 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1428fc5-6ae1-4387-9635-69c26981be2a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fqfdd\" (UID: \"c1428fc5-6ae1-4387-9635-69c26981be2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" Oct 06 07:15:49 crc kubenswrapper[4845]: I1006 07:15:49.136805 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjxd5\" (UniqueName: \"kubernetes.io/projected/c1428fc5-6ae1-4387-9635-69c26981be2a-kube-api-access-kjxd5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fqfdd\" (UID: \"c1428fc5-6ae1-4387-9635-69c26981be2a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" Oct 06 07:15:49 crc kubenswrapper[4845]: I1006 07:15:49.274915 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" Oct 06 07:15:49 crc kubenswrapper[4845]: I1006 07:15:49.783299 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd"] Oct 06 07:15:49 crc kubenswrapper[4845]: I1006 07:15:49.790760 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 07:15:49 crc kubenswrapper[4845]: I1006 07:15:49.816437 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" event={"ID":"c1428fc5-6ae1-4387-9635-69c26981be2a","Type":"ContainerStarted","Data":"fc5ea34bdbb0bdfb0d0815d5f63aded99385039e1484519063e050790ece83c3"} Oct 06 07:15:51 crc kubenswrapper[4845]: I1006 07:15:51.837011 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" event={"ID":"c1428fc5-6ae1-4387-9635-69c26981be2a","Type":"ContainerStarted","Data":"71bc84e0aae62394077d668bd9a4eee661725c447400313cb21fbe8645c517ac"} Oct 06 07:15:51 crc kubenswrapper[4845]: I1006 07:15:51.855157 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" podStartSLOduration=3.029141836 podStartE2EDuration="3.855138585s" podCreationTimestamp="2025-10-06 07:15:48 +0000 UTC" firstStartedPulling="2025-10-06 07:15:49.790495727 +0000 UTC m=+1834.305236735" lastFinishedPulling="2025-10-06 07:15:50.616492476 +0000 UTC m=+1835.131233484" observedRunningTime="2025-10-06 07:15:51.8502688 +0000 UTC m=+1836.365009818" watchObservedRunningTime="2025-10-06 07:15:51.855138585 +0000 UTC m=+1836.369879593" Oct 06 07:16:53 crc kubenswrapper[4845]: I1006 07:16:53.414432 4845 generic.go:334] "Generic (PLEG): container finished" podID="c1428fc5-6ae1-4387-9635-69c26981be2a" containerID="71bc84e0aae62394077d668bd9a4eee661725c447400313cb21fbe8645c517ac" exitCode=0 Oct 06 
07:16:53 crc kubenswrapper[4845]: I1006 07:16:53.414515 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" event={"ID":"c1428fc5-6ae1-4387-9635-69c26981be2a","Type":"ContainerDied","Data":"71bc84e0aae62394077d668bd9a4eee661725c447400313cb21fbe8645c517ac"} Oct 06 07:16:54 crc kubenswrapper[4845]: I1006 07:16:54.785092 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" Oct 06 07:16:54 crc kubenswrapper[4845]: I1006 07:16:54.885320 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c1428fc5-6ae1-4387-9635-69c26981be2a-ovncontroller-config-0\") pod \"c1428fc5-6ae1-4387-9635-69c26981be2a\" (UID: \"c1428fc5-6ae1-4387-9635-69c26981be2a\") " Oct 06 07:16:54 crc kubenswrapper[4845]: I1006 07:16:54.885407 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1428fc5-6ae1-4387-9635-69c26981be2a-inventory\") pod \"c1428fc5-6ae1-4387-9635-69c26981be2a\" (UID: \"c1428fc5-6ae1-4387-9635-69c26981be2a\") " Oct 06 07:16:54 crc kubenswrapper[4845]: I1006 07:16:54.885458 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1428fc5-6ae1-4387-9635-69c26981be2a-ssh-key\") pod \"c1428fc5-6ae1-4387-9635-69c26981be2a\" (UID: \"c1428fc5-6ae1-4387-9635-69c26981be2a\") " Oct 06 07:16:54 crc kubenswrapper[4845]: I1006 07:16:54.885566 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjxd5\" (UniqueName: \"kubernetes.io/projected/c1428fc5-6ae1-4387-9635-69c26981be2a-kube-api-access-kjxd5\") pod \"c1428fc5-6ae1-4387-9635-69c26981be2a\" (UID: \"c1428fc5-6ae1-4387-9635-69c26981be2a\") " Oct 06 07:16:54 crc kubenswrapper[4845]: I1006 
07:16:54.885667 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1428fc5-6ae1-4387-9635-69c26981be2a-ovn-combined-ca-bundle\") pod \"c1428fc5-6ae1-4387-9635-69c26981be2a\" (UID: \"c1428fc5-6ae1-4387-9635-69c26981be2a\") " Oct 06 07:16:54 crc kubenswrapper[4845]: I1006 07:16:54.891284 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1428fc5-6ae1-4387-9635-69c26981be2a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "c1428fc5-6ae1-4387-9635-69c26981be2a" (UID: "c1428fc5-6ae1-4387-9635-69c26981be2a"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:16:54 crc kubenswrapper[4845]: I1006 07:16:54.891658 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1428fc5-6ae1-4387-9635-69c26981be2a-kube-api-access-kjxd5" (OuterVolumeSpecName: "kube-api-access-kjxd5") pod "c1428fc5-6ae1-4387-9635-69c26981be2a" (UID: "c1428fc5-6ae1-4387-9635-69c26981be2a"). InnerVolumeSpecName "kube-api-access-kjxd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:16:54 crc kubenswrapper[4845]: I1006 07:16:54.910514 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1428fc5-6ae1-4387-9635-69c26981be2a-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "c1428fc5-6ae1-4387-9635-69c26981be2a" (UID: "c1428fc5-6ae1-4387-9635-69c26981be2a"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:16:54 crc kubenswrapper[4845]: I1006 07:16:54.914076 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1428fc5-6ae1-4387-9635-69c26981be2a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c1428fc5-6ae1-4387-9635-69c26981be2a" (UID: "c1428fc5-6ae1-4387-9635-69c26981be2a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:16:54 crc kubenswrapper[4845]: I1006 07:16:54.914164 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1428fc5-6ae1-4387-9635-69c26981be2a-inventory" (OuterVolumeSpecName: "inventory") pod "c1428fc5-6ae1-4387-9635-69c26981be2a" (UID: "c1428fc5-6ae1-4387-9635-69c26981be2a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:16:54 crc kubenswrapper[4845]: I1006 07:16:54.988973 4845 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1428fc5-6ae1-4387-9635-69c26981be2a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:16:54 crc kubenswrapper[4845]: I1006 07:16:54.989006 4845 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c1428fc5-6ae1-4387-9635-69c26981be2a-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 07:16:54 crc kubenswrapper[4845]: I1006 07:16:54.989015 4845 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1428fc5-6ae1-4387-9635-69c26981be2a-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 07:16:54 crc kubenswrapper[4845]: I1006 07:16:54.989023 4845 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c1428fc5-6ae1-4387-9635-69c26981be2a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 07:16:54 crc 
kubenswrapper[4845]: I1006 07:16:54.989035 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjxd5\" (UniqueName: \"kubernetes.io/projected/c1428fc5-6ae1-4387-9635-69c26981be2a-kube-api-access-kjxd5\") on node \"crc\" DevicePath \"\"" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.431951 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" event={"ID":"c1428fc5-6ae1-4387-9635-69c26981be2a","Type":"ContainerDied","Data":"fc5ea34bdbb0bdfb0d0815d5f63aded99385039e1484519063e050790ece83c3"} Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.432005 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc5ea34bdbb0bdfb0d0815d5f63aded99385039e1484519063e050790ece83c3" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.432007 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fqfdd" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.512279 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh"] Oct 06 07:16:55 crc kubenswrapper[4845]: E1006 07:16:55.512791 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1428fc5-6ae1-4387-9635-69c26981be2a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.512816 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1428fc5-6ae1-4387-9635-69c26981be2a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.513046 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1428fc5-6ae1-4387-9635-69c26981be2a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.513920 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.516993 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.517088 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.517243 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.517414 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p48vv" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.517545 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.517654 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.524890 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh"] Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.599587 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.599688 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.599721 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqp88\" (UniqueName: \"kubernetes.io/projected/60f68944-a123-4f0a-ba3f-8215bf68a123-kube-api-access-zqp88\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.600139 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.600244 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.600312 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.702634 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.702729 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.702770 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.702795 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqp88\" (UniqueName: 
\"kubernetes.io/projected/60f68944-a123-4f0a-ba3f-8215bf68a123-kube-api-access-zqp88\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.702924 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.702963 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.707798 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.707805 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.708378 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.708694 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.709121 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.720937 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqp88\" (UniqueName: \"kubernetes.io/projected/60f68944-a123-4f0a-ba3f-8215bf68a123-kube-api-access-zqp88\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:55 crc kubenswrapper[4845]: I1006 07:16:55.832637 4845 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:16:56 crc kubenswrapper[4845]: I1006 07:16:56.319047 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh"] Oct 06 07:16:56 crc kubenswrapper[4845]: I1006 07:16:56.440935 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" event={"ID":"60f68944-a123-4f0a-ba3f-8215bf68a123","Type":"ContainerStarted","Data":"72f6471b80cc43cd7f99ec5b830fa12f1652961a5f295fe7b78ad39b9b42c8db"} Oct 06 07:16:57 crc kubenswrapper[4845]: I1006 07:16:57.456703 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" event={"ID":"60f68944-a123-4f0a-ba3f-8215bf68a123","Type":"ContainerStarted","Data":"9fd481ec0bd6a218fb3296297cb6cd924f0f169f1c3a7385e82db6603cb2ac3b"} Oct 06 07:16:57 crc kubenswrapper[4845]: I1006 07:16:57.474470 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" podStartSLOduration=1.951321013 podStartE2EDuration="2.47445361s" podCreationTimestamp="2025-10-06 07:16:55 +0000 UTC" firstStartedPulling="2025-10-06 07:16:56.329025242 +0000 UTC m=+1900.843766260" lastFinishedPulling="2025-10-06 07:16:56.852157849 +0000 UTC m=+1901.366898857" observedRunningTime="2025-10-06 07:16:57.471733361 +0000 UTC m=+1901.986474389" watchObservedRunningTime="2025-10-06 07:16:57.47445361 +0000 UTC m=+1901.989194618" Oct 06 07:17:45 crc kubenswrapper[4845]: I1006 07:17:45.918178 4845 generic.go:334] "Generic (PLEG): container finished" podID="60f68944-a123-4f0a-ba3f-8215bf68a123" containerID="9fd481ec0bd6a218fb3296297cb6cd924f0f169f1c3a7385e82db6603cb2ac3b" exitCode=0 Oct 06 07:17:45 crc kubenswrapper[4845]: I1006 
07:17:45.918280 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" event={"ID":"60f68944-a123-4f0a-ba3f-8215bf68a123","Type":"ContainerDied","Data":"9fd481ec0bd6a218fb3296297cb6cd924f0f169f1c3a7385e82db6603cb2ac3b"} Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.426284 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.490974 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-neutron-metadata-combined-ca-bundle\") pod \"60f68944-a123-4f0a-ba3f-8215bf68a123\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.491059 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-ssh-key\") pod \"60f68944-a123-4f0a-ba3f-8215bf68a123\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.491156 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-nova-metadata-neutron-config-0\") pod \"60f68944-a123-4f0a-ba3f-8215bf68a123\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.491290 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-inventory\") pod \"60f68944-a123-4f0a-ba3f-8215bf68a123\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " Oct 06 07:17:47 crc 
kubenswrapper[4845]: I1006 07:17:47.491310 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-neutron-ovn-metadata-agent-neutron-config-0\") pod \"60f68944-a123-4f0a-ba3f-8215bf68a123\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.491331 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqp88\" (UniqueName: \"kubernetes.io/projected/60f68944-a123-4f0a-ba3f-8215bf68a123-kube-api-access-zqp88\") pod \"60f68944-a123-4f0a-ba3f-8215bf68a123\" (UID: \"60f68944-a123-4f0a-ba3f-8215bf68a123\") " Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.498217 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60f68944-a123-4f0a-ba3f-8215bf68a123-kube-api-access-zqp88" (OuterVolumeSpecName: "kube-api-access-zqp88") pod "60f68944-a123-4f0a-ba3f-8215bf68a123" (UID: "60f68944-a123-4f0a-ba3f-8215bf68a123"). InnerVolumeSpecName "kube-api-access-zqp88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.500218 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "60f68944-a123-4f0a-ba3f-8215bf68a123" (UID: "60f68944-a123-4f0a-ba3f-8215bf68a123"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.522983 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-inventory" (OuterVolumeSpecName: "inventory") pod "60f68944-a123-4f0a-ba3f-8215bf68a123" (UID: "60f68944-a123-4f0a-ba3f-8215bf68a123"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.525183 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "60f68944-a123-4f0a-ba3f-8215bf68a123" (UID: "60f68944-a123-4f0a-ba3f-8215bf68a123"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.528983 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "60f68944-a123-4f0a-ba3f-8215bf68a123" (UID: "60f68944-a123-4f0a-ba3f-8215bf68a123"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.529574 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "60f68944-a123-4f0a-ba3f-8215bf68a123" (UID: "60f68944-a123-4f0a-ba3f-8215bf68a123"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.593103 4845 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.593148 4845 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.593160 4845 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.593172 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqp88\" (UniqueName: \"kubernetes.io/projected/60f68944-a123-4f0a-ba3f-8215bf68a123-kube-api-access-zqp88\") on node \"crc\" DevicePath \"\"" Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.593182 4845 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.593192 4845 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60f68944-a123-4f0a-ba3f-8215bf68a123-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.936792 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" 
event={"ID":"60f68944-a123-4f0a-ba3f-8215bf68a123","Type":"ContainerDied","Data":"72f6471b80cc43cd7f99ec5b830fa12f1652961a5f295fe7b78ad39b9b42c8db"} Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.936834 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72f6471b80cc43cd7f99ec5b830fa12f1652961a5f295fe7b78ad39b9b42c8db" Oct 06 07:17:47 crc kubenswrapper[4845]: I1006 07:17:47.936860 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.071474 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v"] Oct 06 07:17:48 crc kubenswrapper[4845]: E1006 07:17:48.072127 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60f68944-a123-4f0a-ba3f-8215bf68a123" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.072154 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="60f68944-a123-4f0a-ba3f-8215bf68a123" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.072411 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="60f68944-a123-4f0a-ba3f-8215bf68a123" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.073528 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.078879 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.078986 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.079081 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.079131 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.081551 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p48vv" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.083315 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v"] Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.224053 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kj28v\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.224171 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kj28v\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.224516 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kj28v\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.224617 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxwh4\" (UniqueName: \"kubernetes.io/projected/50be375c-cf6d-4540-930c-e09f602c4045-kube-api-access-lxwh4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kj28v\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.224785 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kj28v\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.327698 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kj28v\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.327789 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lxwh4\" (UniqueName: \"kubernetes.io/projected/50be375c-cf6d-4540-930c-e09f602c4045-kube-api-access-lxwh4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kj28v\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.327815 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kj28v\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.327897 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kj28v\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.327921 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kj28v\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.334559 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kj28v\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.334748 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kj28v\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.334998 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kj28v\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.335243 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kj28v\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.347783 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxwh4\" (UniqueName: \"kubernetes.io/projected/50be375c-cf6d-4540-930c-e09f602c4045-kube-api-access-lxwh4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kj28v\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.406057 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" Oct 06 07:17:48 crc kubenswrapper[4845]: I1006 07:17:48.942346 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v"] Oct 06 07:17:49 crc kubenswrapper[4845]: I1006 07:17:49.959009 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" event={"ID":"50be375c-cf6d-4540-930c-e09f602c4045","Type":"ContainerStarted","Data":"f4c378037a4d780a139118e9ff5cd0136b89d47df15d2821a53437ac9cba36e5"} Oct 06 07:17:49 crc kubenswrapper[4845]: I1006 07:17:49.959066 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" event={"ID":"50be375c-cf6d-4540-930c-e09f602c4045","Type":"ContainerStarted","Data":"f7cb9d45a982befa5a95691e6674e7b63cd60d4bac13ebf9a3ead2fab3c51329"} Oct 06 07:17:49 crc kubenswrapper[4845]: I1006 07:17:49.984737 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" podStartSLOduration=1.368537949 podStartE2EDuration="1.984714421s" podCreationTimestamp="2025-10-06 07:17:48 +0000 UTC" firstStartedPulling="2025-10-06 07:17:48.952409881 +0000 UTC m=+1953.467150889" lastFinishedPulling="2025-10-06 07:17:49.568586363 +0000 UTC m=+1954.083327361" observedRunningTime="2025-10-06 07:17:49.976178904 +0000 UTC m=+1954.490919932" watchObservedRunningTime="2025-10-06 07:17:49.984714421 +0000 UTC m=+1954.499455429" Oct 06 07:17:53 crc kubenswrapper[4845]: I1006 07:17:53.019094 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 07:17:53 crc kubenswrapper[4845]: I1006 
07:17:53.019725 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 07:18:23 crc kubenswrapper[4845]: I1006 07:18:23.019032 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 07:18:23 crc kubenswrapper[4845]: I1006 07:18:23.019625 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 07:18:35 crc kubenswrapper[4845]: I1006 07:18:35.621105 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xfz5z"]
Oct 06 07:18:35 crc kubenswrapper[4845]: I1006 07:18:35.625771 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xfz5z"
Oct 06 07:18:35 crc kubenswrapper[4845]: I1006 07:18:35.636994 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xfz5z"]
Oct 06 07:18:35 crc kubenswrapper[4845]: I1006 07:18:35.688163 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e1183dd-101a-4550-827f-9a3b2a55edd8-catalog-content\") pod \"certified-operators-xfz5z\" (UID: \"3e1183dd-101a-4550-827f-9a3b2a55edd8\") " pod="openshift-marketplace/certified-operators-xfz5z"
Oct 06 07:18:35 crc kubenswrapper[4845]: I1006 07:18:35.688697 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26lh4\" (UniqueName: \"kubernetes.io/projected/3e1183dd-101a-4550-827f-9a3b2a55edd8-kube-api-access-26lh4\") pod \"certified-operators-xfz5z\" (UID: \"3e1183dd-101a-4550-827f-9a3b2a55edd8\") " pod="openshift-marketplace/certified-operators-xfz5z"
Oct 06 07:18:35 crc kubenswrapper[4845]: I1006 07:18:35.689022 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e1183dd-101a-4550-827f-9a3b2a55edd8-utilities\") pod \"certified-operators-xfz5z\" (UID: \"3e1183dd-101a-4550-827f-9a3b2a55edd8\") " pod="openshift-marketplace/certified-operators-xfz5z"
Oct 06 07:18:35 crc kubenswrapper[4845]: I1006 07:18:35.800927 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26lh4\" (UniqueName: \"kubernetes.io/projected/3e1183dd-101a-4550-827f-9a3b2a55edd8-kube-api-access-26lh4\") pod \"certified-operators-xfz5z\" (UID: \"3e1183dd-101a-4550-827f-9a3b2a55edd8\") " pod="openshift-marketplace/certified-operators-xfz5z"
Oct 06 07:18:35 crc kubenswrapper[4845]: I1006 07:18:35.801073 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e1183dd-101a-4550-827f-9a3b2a55edd8-utilities\") pod \"certified-operators-xfz5z\" (UID: \"3e1183dd-101a-4550-827f-9a3b2a55edd8\") " pod="openshift-marketplace/certified-operators-xfz5z"
Oct 06 07:18:35 crc kubenswrapper[4845]: I1006 07:18:35.801171 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e1183dd-101a-4550-827f-9a3b2a55edd8-catalog-content\") pod \"certified-operators-xfz5z\" (UID: \"3e1183dd-101a-4550-827f-9a3b2a55edd8\") " pod="openshift-marketplace/certified-operators-xfz5z"
Oct 06 07:18:35 crc kubenswrapper[4845]: I1006 07:18:35.801821 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e1183dd-101a-4550-827f-9a3b2a55edd8-catalog-content\") pod \"certified-operators-xfz5z\" (UID: \"3e1183dd-101a-4550-827f-9a3b2a55edd8\") " pod="openshift-marketplace/certified-operators-xfz5z"
Oct 06 07:18:35 crc kubenswrapper[4845]: I1006 07:18:35.802621 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e1183dd-101a-4550-827f-9a3b2a55edd8-utilities\") pod \"certified-operators-xfz5z\" (UID: \"3e1183dd-101a-4550-827f-9a3b2a55edd8\") " pod="openshift-marketplace/certified-operators-xfz5z"
Oct 06 07:18:35 crc kubenswrapper[4845]: I1006 07:18:35.820492 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26lh4\" (UniqueName: \"kubernetes.io/projected/3e1183dd-101a-4550-827f-9a3b2a55edd8-kube-api-access-26lh4\") pod \"certified-operators-xfz5z\" (UID: \"3e1183dd-101a-4550-827f-9a3b2a55edd8\") " pod="openshift-marketplace/certified-operators-xfz5z"
Oct 06 07:18:35 crc kubenswrapper[4845]: I1006 07:18:35.968347 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xfz5z"
Oct 06 07:18:36 crc kubenswrapper[4845]: I1006 07:18:36.457493 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xfz5z"]
Oct 06 07:18:37 crc kubenswrapper[4845]: I1006 07:18:37.399499 4845 generic.go:334] "Generic (PLEG): container finished" podID="3e1183dd-101a-4550-827f-9a3b2a55edd8" containerID="5fd6d2880d257eee92994eb46d9c5ea7d90af636bffaf8ede127178a1b58baee" exitCode=0
Oct 06 07:18:37 crc kubenswrapper[4845]: I1006 07:18:37.399546 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xfz5z" event={"ID":"3e1183dd-101a-4550-827f-9a3b2a55edd8","Type":"ContainerDied","Data":"5fd6d2880d257eee92994eb46d9c5ea7d90af636bffaf8ede127178a1b58baee"}
Oct 06 07:18:37 crc kubenswrapper[4845]: I1006 07:18:37.399572 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xfz5z" event={"ID":"3e1183dd-101a-4550-827f-9a3b2a55edd8","Type":"ContainerStarted","Data":"ebb4932d3629c387e1f8fc3f8f92714402c519f001c54df55fc24ab24a8e5915"}
Oct 06 07:18:39 crc kubenswrapper[4845]: I1006 07:18:39.416877 4845 generic.go:334] "Generic (PLEG): container finished" podID="3e1183dd-101a-4550-827f-9a3b2a55edd8" containerID="f61dc244f4959c792b5e043795c07c33b21ba900a5aa60c2617d62ccdecdf590" exitCode=0
Oct 06 07:18:39 crc kubenswrapper[4845]: I1006 07:18:39.417783 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xfz5z" event={"ID":"3e1183dd-101a-4550-827f-9a3b2a55edd8","Type":"ContainerDied","Data":"f61dc244f4959c792b5e043795c07c33b21ba900a5aa60c2617d62ccdecdf590"}
Oct 06 07:18:40 crc kubenswrapper[4845]: I1006 07:18:40.429182 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xfz5z" event={"ID":"3e1183dd-101a-4550-827f-9a3b2a55edd8","Type":"ContainerStarted","Data":"f629551a65aec81ea95a00f2e7f032edbbbad866d0beda088587834410d83575"}
Oct 06 07:18:40 crc kubenswrapper[4845]: I1006 07:18:40.464398 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xfz5z" podStartSLOduration=2.837051557 podStartE2EDuration="5.464364974s" podCreationTimestamp="2025-10-06 07:18:35 +0000 UTC" firstStartedPulling="2025-10-06 07:18:37.402077022 +0000 UTC m=+2001.916818030" lastFinishedPulling="2025-10-06 07:18:40.029390439 +0000 UTC m=+2004.544131447" observedRunningTime="2025-10-06 07:18:40.444571772 +0000 UTC m=+2004.959312780" watchObservedRunningTime="2025-10-06 07:18:40.464364974 +0000 UTC m=+2004.979105982"
Oct 06 07:18:45 crc kubenswrapper[4845]: I1006 07:18:45.968517 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xfz5z"
Oct 06 07:18:45 crc kubenswrapper[4845]: I1006 07:18:45.969093 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xfz5z"
Oct 06 07:18:46 crc kubenswrapper[4845]: I1006 07:18:46.024125 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xfz5z"
Oct 06 07:18:46 crc kubenswrapper[4845]: I1006 07:18:46.530652 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xfz5z"
Oct 06 07:18:48 crc kubenswrapper[4845]: I1006 07:18:48.536421 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xfz5z"]
Oct 06 07:18:48 crc kubenswrapper[4845]: I1006 07:18:48.537057 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xfz5z" podUID="3e1183dd-101a-4550-827f-9a3b2a55edd8" containerName="registry-server" containerID="cri-o://f629551a65aec81ea95a00f2e7f032edbbbad866d0beda088587834410d83575" gracePeriod=2
Oct 06 07:18:49 crc kubenswrapper[4845]: I1006 07:18:49.503594 4845 generic.go:334] "Generic (PLEG): container finished" podID="3e1183dd-101a-4550-827f-9a3b2a55edd8" containerID="f629551a65aec81ea95a00f2e7f032edbbbad866d0beda088587834410d83575" exitCode=0
Oct 06 07:18:49 crc kubenswrapper[4845]: I1006 07:18:49.503638 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xfz5z" event={"ID":"3e1183dd-101a-4550-827f-9a3b2a55edd8","Type":"ContainerDied","Data":"f629551a65aec81ea95a00f2e7f032edbbbad866d0beda088587834410d83575"}
Oct 06 07:18:49 crc kubenswrapper[4845]: I1006 07:18:49.957892 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xfz5z"
Oct 06 07:18:50 crc kubenswrapper[4845]: I1006 07:18:50.080598 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e1183dd-101a-4550-827f-9a3b2a55edd8-utilities\") pod \"3e1183dd-101a-4550-827f-9a3b2a55edd8\" (UID: \"3e1183dd-101a-4550-827f-9a3b2a55edd8\") "
Oct 06 07:18:50 crc kubenswrapper[4845]: I1006 07:18:50.080698 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26lh4\" (UniqueName: \"kubernetes.io/projected/3e1183dd-101a-4550-827f-9a3b2a55edd8-kube-api-access-26lh4\") pod \"3e1183dd-101a-4550-827f-9a3b2a55edd8\" (UID: \"3e1183dd-101a-4550-827f-9a3b2a55edd8\") "
Oct 06 07:18:50 crc kubenswrapper[4845]: I1006 07:18:50.080768 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e1183dd-101a-4550-827f-9a3b2a55edd8-catalog-content\") pod \"3e1183dd-101a-4550-827f-9a3b2a55edd8\" (UID: \"3e1183dd-101a-4550-827f-9a3b2a55edd8\") "
Oct 06 07:18:50 crc kubenswrapper[4845]: I1006 07:18:50.082791 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e1183dd-101a-4550-827f-9a3b2a55edd8-utilities" (OuterVolumeSpecName: "utilities") pod "3e1183dd-101a-4550-827f-9a3b2a55edd8" (UID: "3e1183dd-101a-4550-827f-9a3b2a55edd8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:18:50 crc kubenswrapper[4845]: I1006 07:18:50.092228 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e1183dd-101a-4550-827f-9a3b2a55edd8-kube-api-access-26lh4" (OuterVolumeSpecName: "kube-api-access-26lh4") pod "3e1183dd-101a-4550-827f-9a3b2a55edd8" (UID: "3e1183dd-101a-4550-827f-9a3b2a55edd8"). InnerVolumeSpecName "kube-api-access-26lh4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:18:50 crc kubenswrapper[4845]: I1006 07:18:50.134700 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e1183dd-101a-4550-827f-9a3b2a55edd8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e1183dd-101a-4550-827f-9a3b2a55edd8" (UID: "3e1183dd-101a-4550-827f-9a3b2a55edd8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:18:50 crc kubenswrapper[4845]: I1006 07:18:50.183228 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e1183dd-101a-4550-827f-9a3b2a55edd8-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 07:18:50 crc kubenswrapper[4845]: I1006 07:18:50.183264 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e1183dd-101a-4550-827f-9a3b2a55edd8-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 07:18:50 crc kubenswrapper[4845]: I1006 07:18:50.183273 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26lh4\" (UniqueName: \"kubernetes.io/projected/3e1183dd-101a-4550-827f-9a3b2a55edd8-kube-api-access-26lh4\") on node \"crc\" DevicePath \"\""
Oct 06 07:18:50 crc kubenswrapper[4845]: I1006 07:18:50.517961 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xfz5z" event={"ID":"3e1183dd-101a-4550-827f-9a3b2a55edd8","Type":"ContainerDied","Data":"ebb4932d3629c387e1f8fc3f8f92714402c519f001c54df55fc24ab24a8e5915"}
Oct 06 07:18:50 crc kubenswrapper[4845]: I1006 07:18:50.518821 4845 scope.go:117] "RemoveContainer" containerID="f629551a65aec81ea95a00f2e7f032edbbbad866d0beda088587834410d83575"
Oct 06 07:18:50 crc kubenswrapper[4845]: I1006 07:18:50.518554 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xfz5z"
Oct 06 07:18:50 crc kubenswrapper[4845]: I1006 07:18:50.544109 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xfz5z"]
Oct 06 07:18:50 crc kubenswrapper[4845]: I1006 07:18:50.546417 4845 scope.go:117] "RemoveContainer" containerID="f61dc244f4959c792b5e043795c07c33b21ba900a5aa60c2617d62ccdecdf590"
Oct 06 07:18:50 crc kubenswrapper[4845]: I1006 07:18:50.552912 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xfz5z"]
Oct 06 07:18:50 crc kubenswrapper[4845]: I1006 07:18:50.569405 4845 scope.go:117] "RemoveContainer" containerID="5fd6d2880d257eee92994eb46d9c5ea7d90af636bffaf8ede127178a1b58baee"
Oct 06 07:18:52 crc kubenswrapper[4845]: I1006 07:18:52.238174 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e1183dd-101a-4550-827f-9a3b2a55edd8" path="/var/lib/kubelet/pods/3e1183dd-101a-4550-827f-9a3b2a55edd8/volumes"
Oct 06 07:18:53 crc kubenswrapper[4845]: I1006 07:18:53.018736 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 07:18:53 crc kubenswrapper[4845]: I1006 07:18:53.018811 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 07:18:53 crc kubenswrapper[4845]: I1006 07:18:53.018869 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6"
Oct 06 07:18:53 crc kubenswrapper[4845]: I1006 07:18:53.019655 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c740b11eacd8eeb60421cfadeeedcffc611c0df3872f825986c7df388e5adde"} pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 07:18:53 crc kubenswrapper[4845]: I1006 07:18:53.019719 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" containerID="cri-o://2c740b11eacd8eeb60421cfadeeedcffc611c0df3872f825986c7df388e5adde" gracePeriod=600
Oct 06 07:18:53 crc kubenswrapper[4845]: I1006 07:18:53.554151 4845 generic.go:334] "Generic (PLEG): container finished" podID="6936952c-09f0-48fd-8832-38c18202ae81" containerID="2c740b11eacd8eeb60421cfadeeedcffc611c0df3872f825986c7df388e5adde" exitCode=0
Oct 06 07:18:53 crc kubenswrapper[4845]: I1006 07:18:53.554232 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerDied","Data":"2c740b11eacd8eeb60421cfadeeedcffc611c0df3872f825986c7df388e5adde"}
Oct 06 07:18:53 crc kubenswrapper[4845]: I1006 07:18:53.554565 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerStarted","Data":"793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b"}
Oct 06 07:18:53 crc kubenswrapper[4845]: I1006 07:18:53.554584 4845 scope.go:117] "RemoveContainer" containerID="49d17f779f25e89e98dfe2291d598d38006c6309933457d5621638174b95bad9"
Oct 06 07:20:53 crc kubenswrapper[4845]: I1006 07:20:53.018789 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 07:20:53 crc kubenswrapper[4845]: I1006 07:20:53.019407 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 07:21:01 crc kubenswrapper[4845]: I1006 07:21:01.373652 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4lrgh"]
Oct 06 07:21:01 crc kubenswrapper[4845]: E1006 07:21:01.375604 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e1183dd-101a-4550-827f-9a3b2a55edd8" containerName="extract-content"
Oct 06 07:21:01 crc kubenswrapper[4845]: I1006 07:21:01.375623 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e1183dd-101a-4550-827f-9a3b2a55edd8" containerName="extract-content"
Oct 06 07:21:01 crc kubenswrapper[4845]: E1006 07:21:01.375665 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e1183dd-101a-4550-827f-9a3b2a55edd8" containerName="extract-utilities"
Oct 06 07:21:01 crc kubenswrapper[4845]: I1006 07:21:01.375673 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e1183dd-101a-4550-827f-9a3b2a55edd8" containerName="extract-utilities"
Oct 06 07:21:01 crc kubenswrapper[4845]: E1006 07:21:01.375693 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e1183dd-101a-4550-827f-9a3b2a55edd8" containerName="registry-server"
Oct 06 07:21:01 crc kubenswrapper[4845]: I1006 07:21:01.375701 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e1183dd-101a-4550-827f-9a3b2a55edd8" containerName="registry-server"
Oct 06 07:21:01 crc kubenswrapper[4845]: I1006 07:21:01.375969 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e1183dd-101a-4550-827f-9a3b2a55edd8" containerName="registry-server"
Oct 06 07:21:01 crc kubenswrapper[4845]: I1006 07:21:01.377993 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4lrgh"
Oct 06 07:21:01 crc kubenswrapper[4845]: I1006 07:21:01.388316 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4lrgh"]
Oct 06 07:21:01 crc kubenswrapper[4845]: I1006 07:21:01.548907 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1188b365-16b7-48b5-850e-93fb6c9ec813-catalog-content\") pod \"community-operators-4lrgh\" (UID: \"1188b365-16b7-48b5-850e-93fb6c9ec813\") " pod="openshift-marketplace/community-operators-4lrgh"
Oct 06 07:21:01 crc kubenswrapper[4845]: I1006 07:21:01.548984 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m24rt\" (UniqueName: \"kubernetes.io/projected/1188b365-16b7-48b5-850e-93fb6c9ec813-kube-api-access-m24rt\") pod \"community-operators-4lrgh\" (UID: \"1188b365-16b7-48b5-850e-93fb6c9ec813\") " pod="openshift-marketplace/community-operators-4lrgh"
Oct 06 07:21:01 crc kubenswrapper[4845]: I1006 07:21:01.549013 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1188b365-16b7-48b5-850e-93fb6c9ec813-utilities\") pod \"community-operators-4lrgh\" (UID: \"1188b365-16b7-48b5-850e-93fb6c9ec813\") " pod="openshift-marketplace/community-operators-4lrgh"
Oct 06 07:21:01 crc kubenswrapper[4845]: I1006 07:21:01.651126 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1188b365-16b7-48b5-850e-93fb6c9ec813-catalog-content\") pod \"community-operators-4lrgh\" (UID: \"1188b365-16b7-48b5-850e-93fb6c9ec813\") " pod="openshift-marketplace/community-operators-4lrgh"
Oct 06 07:21:01 crc kubenswrapper[4845]: I1006 07:21:01.651195 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m24rt\" (UniqueName: \"kubernetes.io/projected/1188b365-16b7-48b5-850e-93fb6c9ec813-kube-api-access-m24rt\") pod \"community-operators-4lrgh\" (UID: \"1188b365-16b7-48b5-850e-93fb6c9ec813\") " pod="openshift-marketplace/community-operators-4lrgh"
Oct 06 07:21:01 crc kubenswrapper[4845]: I1006 07:21:01.651225 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1188b365-16b7-48b5-850e-93fb6c9ec813-utilities\") pod \"community-operators-4lrgh\" (UID: \"1188b365-16b7-48b5-850e-93fb6c9ec813\") " pod="openshift-marketplace/community-operators-4lrgh"
Oct 06 07:21:01 crc kubenswrapper[4845]: I1006 07:21:01.651757 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1188b365-16b7-48b5-850e-93fb6c9ec813-catalog-content\") pod \"community-operators-4lrgh\" (UID: \"1188b365-16b7-48b5-850e-93fb6c9ec813\") " pod="openshift-marketplace/community-operators-4lrgh"
Oct 06 07:21:01 crc kubenswrapper[4845]: I1006 07:21:01.651832 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1188b365-16b7-48b5-850e-93fb6c9ec813-utilities\") pod \"community-operators-4lrgh\" (UID: \"1188b365-16b7-48b5-850e-93fb6c9ec813\") " pod="openshift-marketplace/community-operators-4lrgh"
Oct 06 07:21:01 crc kubenswrapper[4845]: I1006 07:21:01.673843 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m24rt\" (UniqueName: \"kubernetes.io/projected/1188b365-16b7-48b5-850e-93fb6c9ec813-kube-api-access-m24rt\") pod \"community-operators-4lrgh\" (UID: \"1188b365-16b7-48b5-850e-93fb6c9ec813\") " pod="openshift-marketplace/community-operators-4lrgh"
Oct 06 07:21:01 crc kubenswrapper[4845]: I1006 07:21:01.746661 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4lrgh"
Oct 06 07:21:02 crc kubenswrapper[4845]: W1006 07:21:02.234081 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1188b365_16b7_48b5_850e_93fb6c9ec813.slice/crio-95fe6d21ae654f1813293815c6033f8b816b99e1c614995403ec47fd26f819cc WatchSource:0}: Error finding container 95fe6d21ae654f1813293815c6033f8b816b99e1c614995403ec47fd26f819cc: Status 404 returned error can't find the container with id 95fe6d21ae654f1813293815c6033f8b816b99e1c614995403ec47fd26f819cc
Oct 06 07:21:02 crc kubenswrapper[4845]: I1006 07:21:02.241943 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4lrgh"]
Oct 06 07:21:02 crc kubenswrapper[4845]: I1006 07:21:02.745830 4845 generic.go:334] "Generic (PLEG): container finished" podID="1188b365-16b7-48b5-850e-93fb6c9ec813" containerID="954d92ce2394d3e59916685194d9057df8f1b01ae03a0aed0931450c64ac0f91" exitCode=0
Oct 06 07:21:02 crc kubenswrapper[4845]: I1006 07:21:02.745893 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4lrgh" event={"ID":"1188b365-16b7-48b5-850e-93fb6c9ec813","Type":"ContainerDied","Data":"954d92ce2394d3e59916685194d9057df8f1b01ae03a0aed0931450c64ac0f91"}
Oct 06 07:21:02 crc kubenswrapper[4845]: I1006 07:21:02.745924 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4lrgh" event={"ID":"1188b365-16b7-48b5-850e-93fb6c9ec813","Type":"ContainerStarted","Data":"95fe6d21ae654f1813293815c6033f8b816b99e1c614995403ec47fd26f819cc"}
Oct 06 07:21:02 crc kubenswrapper[4845]: I1006 07:21:02.748437 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 06 07:21:03 crc kubenswrapper[4845]: I1006 07:21:03.755848 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4lrgh" event={"ID":"1188b365-16b7-48b5-850e-93fb6c9ec813","Type":"ContainerStarted","Data":"13165fdd9d9b1843d4838b6d73d62d62c0210520c78caadcd4fd10b39d9bc813"}
Oct 06 07:21:04 crc kubenswrapper[4845]: I1006 07:21:04.764968 4845 generic.go:334] "Generic (PLEG): container finished" podID="1188b365-16b7-48b5-850e-93fb6c9ec813" containerID="13165fdd9d9b1843d4838b6d73d62d62c0210520c78caadcd4fd10b39d9bc813" exitCode=0
Oct 06 07:21:04 crc kubenswrapper[4845]: I1006 07:21:04.765052 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4lrgh" event={"ID":"1188b365-16b7-48b5-850e-93fb6c9ec813","Type":"ContainerDied","Data":"13165fdd9d9b1843d4838b6d73d62d62c0210520c78caadcd4fd10b39d9bc813"}
Oct 06 07:21:05 crc kubenswrapper[4845]: I1006 07:21:05.774522 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4lrgh" event={"ID":"1188b365-16b7-48b5-850e-93fb6c9ec813","Type":"ContainerStarted","Data":"f9b6917327e5f76156e638066745b868e31825ea80cf8813406df5cfdd2bd791"}
Oct 06 07:21:05 crc kubenswrapper[4845]: I1006 07:21:05.793108 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4lrgh" podStartSLOduration=2.104963303 podStartE2EDuration="4.793086843s" podCreationTimestamp="2025-10-06 07:21:01 +0000 UTC" firstStartedPulling="2025-10-06 07:21:02.74815439 +0000 UTC m=+2147.262895398" lastFinishedPulling="2025-10-06 07:21:05.43627793 +0000 UTC m=+2149.951018938" observedRunningTime="2025-10-06 07:21:05.790472557 +0000 UTC m=+2150.305213585" watchObservedRunningTime="2025-10-06 07:21:05.793086843 +0000 UTC m=+2150.307827851"
Oct 06 07:21:11 crc kubenswrapper[4845]: I1006 07:21:11.747331 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4lrgh"
Oct 06 07:21:11 crc kubenswrapper[4845]: I1006 07:21:11.748935 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4lrgh"
Oct 06 07:21:11 crc kubenswrapper[4845]: I1006 07:21:11.820991 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4lrgh"
Oct 06 07:21:11 crc kubenswrapper[4845]: I1006 07:21:11.887604 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4lrgh"
Oct 06 07:21:12 crc kubenswrapper[4845]: I1006 07:21:12.063006 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4lrgh"]
Oct 06 07:21:13 crc kubenswrapper[4845]: I1006 07:21:13.847620 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4lrgh" podUID="1188b365-16b7-48b5-850e-93fb6c9ec813" containerName="registry-server" containerID="cri-o://f9b6917327e5f76156e638066745b868e31825ea80cf8813406df5cfdd2bd791" gracePeriod=2
Oct 06 07:21:14 crc kubenswrapper[4845]: I1006 07:21:14.860028 4845 generic.go:334] "Generic (PLEG): container finished" podID="1188b365-16b7-48b5-850e-93fb6c9ec813" containerID="f9b6917327e5f76156e638066745b868e31825ea80cf8813406df5cfdd2bd791" exitCode=0
Oct 06 07:21:14 crc kubenswrapper[4845]: I1006 07:21:14.860321 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4lrgh" event={"ID":"1188b365-16b7-48b5-850e-93fb6c9ec813","Type":"ContainerDied","Data":"f9b6917327e5f76156e638066745b868e31825ea80cf8813406df5cfdd2bd791"}
Oct 06 07:21:15 crc kubenswrapper[4845]: I1006 07:21:15.335435 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4lrgh"
Oct 06 07:21:15 crc kubenswrapper[4845]: I1006 07:21:15.451833 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1188b365-16b7-48b5-850e-93fb6c9ec813-utilities\") pod \"1188b365-16b7-48b5-850e-93fb6c9ec813\" (UID: \"1188b365-16b7-48b5-850e-93fb6c9ec813\") "
Oct 06 07:21:15 crc kubenswrapper[4845]: I1006 07:21:15.451912 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1188b365-16b7-48b5-850e-93fb6c9ec813-catalog-content\") pod \"1188b365-16b7-48b5-850e-93fb6c9ec813\" (UID: \"1188b365-16b7-48b5-850e-93fb6c9ec813\") "
Oct 06 07:21:15 crc kubenswrapper[4845]: I1006 07:21:15.452029 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m24rt\" (UniqueName: \"kubernetes.io/projected/1188b365-16b7-48b5-850e-93fb6c9ec813-kube-api-access-m24rt\") pod \"1188b365-16b7-48b5-850e-93fb6c9ec813\" (UID: \"1188b365-16b7-48b5-850e-93fb6c9ec813\") "
Oct 06 07:21:15 crc kubenswrapper[4845]: I1006 07:21:15.452956 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1188b365-16b7-48b5-850e-93fb6c9ec813-utilities" (OuterVolumeSpecName: "utilities") pod "1188b365-16b7-48b5-850e-93fb6c9ec813" (UID: "1188b365-16b7-48b5-850e-93fb6c9ec813"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:21:15 crc kubenswrapper[4845]: I1006 07:21:15.458041 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1188b365-16b7-48b5-850e-93fb6c9ec813-kube-api-access-m24rt" (OuterVolumeSpecName: "kube-api-access-m24rt") pod "1188b365-16b7-48b5-850e-93fb6c9ec813" (UID: "1188b365-16b7-48b5-850e-93fb6c9ec813"). InnerVolumeSpecName "kube-api-access-m24rt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:21:15 crc kubenswrapper[4845]: I1006 07:21:15.503065 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1188b365-16b7-48b5-850e-93fb6c9ec813-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1188b365-16b7-48b5-850e-93fb6c9ec813" (UID: "1188b365-16b7-48b5-850e-93fb6c9ec813"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:21:15 crc kubenswrapper[4845]: I1006 07:21:15.554552 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1188b365-16b7-48b5-850e-93fb6c9ec813-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 07:21:15 crc kubenswrapper[4845]: I1006 07:21:15.554588 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1188b365-16b7-48b5-850e-93fb6c9ec813-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 07:21:15 crc kubenswrapper[4845]: I1006 07:21:15.554600 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m24rt\" (UniqueName: \"kubernetes.io/projected/1188b365-16b7-48b5-850e-93fb6c9ec813-kube-api-access-m24rt\") on node \"crc\" DevicePath \"\""
Oct 06 07:21:15 crc kubenswrapper[4845]: I1006 07:21:15.874212 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4lrgh" event={"ID":"1188b365-16b7-48b5-850e-93fb6c9ec813","Type":"ContainerDied","Data":"95fe6d21ae654f1813293815c6033f8b816b99e1c614995403ec47fd26f819cc"}
Oct 06 07:21:15 crc kubenswrapper[4845]: I1006 07:21:15.874275 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4lrgh"
Oct 06 07:21:15 crc kubenswrapper[4845]: I1006 07:21:15.874292 4845 scope.go:117] "RemoveContainer" containerID="f9b6917327e5f76156e638066745b868e31825ea80cf8813406df5cfdd2bd791"
Oct 06 07:21:15 crc kubenswrapper[4845]: I1006 07:21:15.932463 4845 scope.go:117] "RemoveContainer" containerID="13165fdd9d9b1843d4838b6d73d62d62c0210520c78caadcd4fd10b39d9bc813"
Oct 06 07:21:15 crc kubenswrapper[4845]: I1006 07:21:15.934716 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4lrgh"]
Oct 06 07:21:15 crc kubenswrapper[4845]: I1006 07:21:15.948305 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4lrgh"]
Oct 06 07:21:15 crc kubenswrapper[4845]: I1006 07:21:15.963223 4845 scope.go:117] "RemoveContainer" containerID="954d92ce2394d3e59916685194d9057df8f1b01ae03a0aed0931450c64ac0f91"
Oct 06 07:21:16 crc kubenswrapper[4845]: I1006 07:21:16.241794 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1188b365-16b7-48b5-850e-93fb6c9ec813" path="/var/lib/kubelet/pods/1188b365-16b7-48b5-850e-93fb6c9ec813/volumes"
Oct 06 07:21:23 crc kubenswrapper[4845]: I1006 07:21:23.018807 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 07:21:23 crc kubenswrapper[4845]: I1006 07:21:23.019343 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 07:21:53 crc kubenswrapper[4845]: I1006 07:21:53.019587 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 07:21:53 crc kubenswrapper[4845]: I1006 07:21:53.020124 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 07:21:53 crc kubenswrapper[4845]: I1006 07:21:53.020177 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6"
Oct 06 07:21:53 crc kubenswrapper[4845]: I1006 07:21:53.021113 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b"} pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 07:21:53 crc kubenswrapper[4845]: I1006 07:21:53.021180 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" containerID="cri-o://793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b" gracePeriod=600
Oct 06 07:21:53 crc kubenswrapper[4845]: E1006 07:21:53.142364 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:21:53 crc kubenswrapper[4845]: I1006 07:21:53.197944 4845 generic.go:334] "Generic (PLEG): container finished" podID="6936952c-09f0-48fd-8832-38c18202ae81" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b" exitCode=0
Oct 06 07:21:53 crc kubenswrapper[4845]: I1006 07:21:53.197985 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerDied","Data":"793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b"}
Oct 06 07:21:53 crc kubenswrapper[4845]: I1006 07:21:53.198015 4845 scope.go:117] "RemoveContainer" containerID="2c740b11eacd8eeb60421cfadeeedcffc611c0df3872f825986c7df388e5adde"
Oct 06 07:21:53 crc kubenswrapper[4845]: I1006 07:21:53.198716 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b"
Oct 06 07:21:53 crc kubenswrapper[4845]: E1006 07:21:53.199030 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6"
podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:21:54 crc kubenswrapper[4845]: I1006 07:21:54.233754 4845 generic.go:334] "Generic (PLEG): container finished" podID="50be375c-cf6d-4540-930c-e09f602c4045" containerID="f4c378037a4d780a139118e9ff5cd0136b89d47df15d2821a53437ac9cba36e5" exitCode=0 Oct 06 07:21:54 crc kubenswrapper[4845]: I1006 07:21:54.249636 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" event={"ID":"50be375c-cf6d-4540-930c-e09f602c4045","Type":"ContainerDied","Data":"f4c378037a4d780a139118e9ff5cd0136b89d47df15d2821a53437ac9cba36e5"} Oct 06 07:21:55 crc kubenswrapper[4845]: I1006 07:21:55.594572 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" Oct 06 07:21:55 crc kubenswrapper[4845]: I1006 07:21:55.745851 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-libvirt-secret-0\") pod \"50be375c-cf6d-4540-930c-e09f602c4045\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " Oct 06 07:21:55 crc kubenswrapper[4845]: I1006 07:21:55.746036 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-ssh-key\") pod \"50be375c-cf6d-4540-930c-e09f602c4045\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " Oct 06 07:21:55 crc kubenswrapper[4845]: I1006 07:21:55.746111 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxwh4\" (UniqueName: \"kubernetes.io/projected/50be375c-cf6d-4540-930c-e09f602c4045-kube-api-access-lxwh4\") pod \"50be375c-cf6d-4540-930c-e09f602c4045\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " Oct 06 07:21:55 crc kubenswrapper[4845]: I1006 07:21:55.746864 4845 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-inventory\") pod \"50be375c-cf6d-4540-930c-e09f602c4045\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " Oct 06 07:21:55 crc kubenswrapper[4845]: I1006 07:21:55.746915 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-libvirt-combined-ca-bundle\") pod \"50be375c-cf6d-4540-930c-e09f602c4045\" (UID: \"50be375c-cf6d-4540-930c-e09f602c4045\") " Oct 06 07:21:55 crc kubenswrapper[4845]: I1006 07:21:55.751626 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50be375c-cf6d-4540-930c-e09f602c4045-kube-api-access-lxwh4" (OuterVolumeSpecName: "kube-api-access-lxwh4") pod "50be375c-cf6d-4540-930c-e09f602c4045" (UID: "50be375c-cf6d-4540-930c-e09f602c4045"). InnerVolumeSpecName "kube-api-access-lxwh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:21:55 crc kubenswrapper[4845]: I1006 07:21:55.752042 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "50be375c-cf6d-4540-930c-e09f602c4045" (UID: "50be375c-cf6d-4540-930c-e09f602c4045"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:21:55 crc kubenswrapper[4845]: I1006 07:21:55.771512 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "50be375c-cf6d-4540-930c-e09f602c4045" (UID: "50be375c-cf6d-4540-930c-e09f602c4045"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:21:55 crc kubenswrapper[4845]: I1006 07:21:55.772161 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "50be375c-cf6d-4540-930c-e09f602c4045" (UID: "50be375c-cf6d-4540-930c-e09f602c4045"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:21:55 crc kubenswrapper[4845]: I1006 07:21:55.779025 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-inventory" (OuterVolumeSpecName: "inventory") pod "50be375c-cf6d-4540-930c-e09f602c4045" (UID: "50be375c-cf6d-4540-930c-e09f602c4045"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:21:55 crc kubenswrapper[4845]: I1006 07:21:55.848726 4845 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 07:21:55 crc kubenswrapper[4845]: I1006 07:21:55.848912 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxwh4\" (UniqueName: \"kubernetes.io/projected/50be375c-cf6d-4540-930c-e09f602c4045-kube-api-access-lxwh4\") on node \"crc\" DevicePath \"\"" Oct 06 07:21:55 crc kubenswrapper[4845]: I1006 07:21:55.848996 4845 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 07:21:55 crc kubenswrapper[4845]: I1006 07:21:55.849087 4845 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:21:55 crc 
kubenswrapper[4845]: I1006 07:21:55.849174 4845 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/50be375c-cf6d-4540-930c-e09f602c4045-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.250441 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" event={"ID":"50be375c-cf6d-4540-930c-e09f602c4045","Type":"ContainerDied","Data":"f7cb9d45a982befa5a95691e6674e7b63cd60d4bac13ebf9a3ead2fab3c51329"} Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.250846 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7cb9d45a982befa5a95691e6674e7b63cd60d4bac13ebf9a3ead2fab3c51329" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.250493 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kj28v" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.352517 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx"] Oct 06 07:21:56 crc kubenswrapper[4845]: E1006 07:21:56.352942 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1188b365-16b7-48b5-850e-93fb6c9ec813" containerName="extract-content" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.352963 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1188b365-16b7-48b5-850e-93fb6c9ec813" containerName="extract-content" Oct 06 07:21:56 crc kubenswrapper[4845]: E1006 07:21:56.352999 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1188b365-16b7-48b5-850e-93fb6c9ec813" containerName="registry-server" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.353008 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1188b365-16b7-48b5-850e-93fb6c9ec813" containerName="registry-server" Oct 06 07:21:56 
crc kubenswrapper[4845]: E1006 07:21:56.353026 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50be375c-cf6d-4540-930c-e09f602c4045" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.353035 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="50be375c-cf6d-4540-930c-e09f602c4045" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 06 07:21:56 crc kubenswrapper[4845]: E1006 07:21:56.353057 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1188b365-16b7-48b5-850e-93fb6c9ec813" containerName="extract-utilities" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.353065 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1188b365-16b7-48b5-850e-93fb6c9ec813" containerName="extract-utilities" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.353293 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="1188b365-16b7-48b5-850e-93fb6c9ec813" containerName="registry-server" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.353327 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="50be375c-cf6d-4540-930c-e09f602c4045" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.353928 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.356137 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.356280 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.356397 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.356552 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p48vv" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.356856 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.356958 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.357011 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.374773 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx"] Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.458708 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skr9t\" (UniqueName: \"kubernetes.io/projected/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-kube-api-access-skr9t\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 
07:21:56.458745 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.458785 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.458880 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.458953 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.458985 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.459003 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.459072 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.459211 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.560908 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skr9t\" (UniqueName: \"kubernetes.io/projected/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-kube-api-access-skr9t\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: 
\"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.560958 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.561000 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.561037 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.561065 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.561091 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" 
(UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.561113 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.561162 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.561194 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.562190 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.564682 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.564912 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.565445 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.565899 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.566522 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.566964 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.573804 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.577516 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skr9t\" (UniqueName: \"kubernetes.io/projected/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-kube-api-access-skr9t\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gftcx\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:56 crc kubenswrapper[4845]: I1006 07:21:56.689654 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:21:57 crc kubenswrapper[4845]: I1006 07:21:57.165291 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx"] Oct 06 07:21:57 crc kubenswrapper[4845]: I1006 07:21:57.259616 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" event={"ID":"6f89fdcf-abd7-4cf4-aa6f-a05ada603477","Type":"ContainerStarted","Data":"1fce448cf3d9a36a54118766c1afdfc133d6f2070e5d37d368d1ab9d9b55ec24"} Oct 06 07:21:58 crc kubenswrapper[4845]: I1006 07:21:58.267707 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" event={"ID":"6f89fdcf-abd7-4cf4-aa6f-a05ada603477","Type":"ContainerStarted","Data":"2b8be1c103ad915c05cc018d9f497de2a69d9d9cc228cde529031741368f0cf0"} Oct 06 07:21:58 crc kubenswrapper[4845]: I1006 07:21:58.293504 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" podStartSLOduration=1.7584772869999998 podStartE2EDuration="2.293484581s" podCreationTimestamp="2025-10-06 07:21:56 +0000 UTC" firstStartedPulling="2025-10-06 07:21:57.172617981 +0000 UTC m=+2201.687358989" lastFinishedPulling="2025-10-06 07:21:57.707625275 +0000 UTC m=+2202.222366283" observedRunningTime="2025-10-06 07:21:58.284901533 +0000 UTC m=+2202.799642561" watchObservedRunningTime="2025-10-06 07:21:58.293484581 +0000 UTC m=+2202.808225589" Oct 06 07:22:07 crc kubenswrapper[4845]: I1006 07:22:07.227603 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b" Oct 06 07:22:07 crc kubenswrapper[4845]: E1006 07:22:07.228498 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:22:20 crc kubenswrapper[4845]: I1006 07:22:20.226544 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b" Oct 06 07:22:20 crc kubenswrapper[4845]: E1006 07:22:20.227343 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:22:31 crc kubenswrapper[4845]: I1006 07:22:31.228144 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b" Oct 06 07:22:31 crc kubenswrapper[4845]: E1006 07:22:31.228895 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:22:45 crc kubenswrapper[4845]: I1006 07:22:45.226736 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b" Oct 06 07:22:45 crc kubenswrapper[4845]: E1006 07:22:45.227631 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:22:59 crc kubenswrapper[4845]: I1006 07:22:59.228203 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b" Oct 06 07:22:59 crc kubenswrapper[4845]: E1006 07:22:59.229192 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:23:12 crc kubenswrapper[4845]: I1006 07:23:12.227088 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b" Oct 06 07:23:12 crc kubenswrapper[4845]: E1006 07:23:12.227795 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:23:24 crc kubenswrapper[4845]: I1006 07:23:24.227604 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b" Oct 06 07:23:24 crc kubenswrapper[4845]: E1006 07:23:24.228329 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:23:36 crc kubenswrapper[4845]: I1006 07:23:36.231222 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b" Oct 06 07:23:36 crc kubenswrapper[4845]: E1006 07:23:36.232033 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:23:49 crc kubenswrapper[4845]: I1006 07:23:49.227419 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b" Oct 06 07:23:49 crc kubenswrapper[4845]: E1006 07:23:49.228129 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:24:00 crc kubenswrapper[4845]: I1006 07:24:00.234566 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b" Oct 06 07:24:00 crc kubenswrapper[4845]: E1006 07:24:00.235468 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:24:15 crc kubenswrapper[4845]: I1006 07:24:15.232668 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b" Oct 06 07:24:15 crc kubenswrapper[4845]: E1006 07:24:15.233711 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:24:30 crc kubenswrapper[4845]: I1006 07:24:30.227491 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b" Oct 06 07:24:30 crc kubenswrapper[4845]: E1006 07:24:30.228524 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:24:44 crc kubenswrapper[4845]: I1006 07:24:44.230155 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b" Oct 06 07:24:44 crc kubenswrapper[4845]: E1006 07:24:44.231571 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:24:55 crc kubenswrapper[4845]: I1006 07:24:55.574364 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j4h64"] Oct 06 07:24:55 crc kubenswrapper[4845]: I1006 07:24:55.577484 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4h64" Oct 06 07:24:55 crc kubenswrapper[4845]: I1006 07:24:55.604944 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4h64"] Oct 06 07:24:55 crc kubenswrapper[4845]: I1006 07:24:55.719416 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2jsl\" (UniqueName: \"kubernetes.io/projected/f02bbc86-1787-4533-ae88-e44fdc4785b4-kube-api-access-w2jsl\") pod \"redhat-marketplace-j4h64\" (UID: \"f02bbc86-1787-4533-ae88-e44fdc4785b4\") " pod="openshift-marketplace/redhat-marketplace-j4h64" Oct 06 07:24:55 crc kubenswrapper[4845]: I1006 07:24:55.719613 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f02bbc86-1787-4533-ae88-e44fdc4785b4-utilities\") pod \"redhat-marketplace-j4h64\" (UID: \"f02bbc86-1787-4533-ae88-e44fdc4785b4\") " pod="openshift-marketplace/redhat-marketplace-j4h64" Oct 06 07:24:55 crc kubenswrapper[4845]: I1006 07:24:55.719648 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f02bbc86-1787-4533-ae88-e44fdc4785b4-catalog-content\") 
pod \"redhat-marketplace-j4h64\" (UID: \"f02bbc86-1787-4533-ae88-e44fdc4785b4\") " pod="openshift-marketplace/redhat-marketplace-j4h64" Oct 06 07:24:55 crc kubenswrapper[4845]: I1006 07:24:55.822187 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f02bbc86-1787-4533-ae88-e44fdc4785b4-utilities\") pod \"redhat-marketplace-j4h64\" (UID: \"f02bbc86-1787-4533-ae88-e44fdc4785b4\") " pod="openshift-marketplace/redhat-marketplace-j4h64" Oct 06 07:24:55 crc kubenswrapper[4845]: I1006 07:24:55.822231 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f02bbc86-1787-4533-ae88-e44fdc4785b4-catalog-content\") pod \"redhat-marketplace-j4h64\" (UID: \"f02bbc86-1787-4533-ae88-e44fdc4785b4\") " pod="openshift-marketplace/redhat-marketplace-j4h64" Oct 06 07:24:55 crc kubenswrapper[4845]: I1006 07:24:55.822349 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2jsl\" (UniqueName: \"kubernetes.io/projected/f02bbc86-1787-4533-ae88-e44fdc4785b4-kube-api-access-w2jsl\") pod \"redhat-marketplace-j4h64\" (UID: \"f02bbc86-1787-4533-ae88-e44fdc4785b4\") " pod="openshift-marketplace/redhat-marketplace-j4h64" Oct 06 07:24:55 crc kubenswrapper[4845]: I1006 07:24:55.822681 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f02bbc86-1787-4533-ae88-e44fdc4785b4-utilities\") pod \"redhat-marketplace-j4h64\" (UID: \"f02bbc86-1787-4533-ae88-e44fdc4785b4\") " pod="openshift-marketplace/redhat-marketplace-j4h64" Oct 06 07:24:55 crc kubenswrapper[4845]: I1006 07:24:55.822912 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f02bbc86-1787-4533-ae88-e44fdc4785b4-catalog-content\") pod \"redhat-marketplace-j4h64\" (UID: 
\"f02bbc86-1787-4533-ae88-e44fdc4785b4\") " pod="openshift-marketplace/redhat-marketplace-j4h64" Oct 06 07:24:55 crc kubenswrapper[4845]: I1006 07:24:55.844346 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2jsl\" (UniqueName: \"kubernetes.io/projected/f02bbc86-1787-4533-ae88-e44fdc4785b4-kube-api-access-w2jsl\") pod \"redhat-marketplace-j4h64\" (UID: \"f02bbc86-1787-4533-ae88-e44fdc4785b4\") " pod="openshift-marketplace/redhat-marketplace-j4h64" Oct 06 07:24:55 crc kubenswrapper[4845]: I1006 07:24:55.900517 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4h64" Oct 06 07:24:56 crc kubenswrapper[4845]: I1006 07:24:56.165476 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4h64"] Oct 06 07:24:56 crc kubenswrapper[4845]: I1006 07:24:56.730570 4845 generic.go:334] "Generic (PLEG): container finished" podID="f02bbc86-1787-4533-ae88-e44fdc4785b4" containerID="540d4ff2e2f85b780b214c3cc616c7496ea86c2668f92c175c925e82e34834d2" exitCode=0 Oct 06 07:24:56 crc kubenswrapper[4845]: I1006 07:24:56.730717 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4h64" event={"ID":"f02bbc86-1787-4533-ae88-e44fdc4785b4","Type":"ContainerDied","Data":"540d4ff2e2f85b780b214c3cc616c7496ea86c2668f92c175c925e82e34834d2"} Oct 06 07:24:56 crc kubenswrapper[4845]: I1006 07:24:56.732831 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4h64" event={"ID":"f02bbc86-1787-4533-ae88-e44fdc4785b4","Type":"ContainerStarted","Data":"44c0a504250c8d0178083c4673acedba2edf1959cee0e4864b60c65aa2b9a078"} Oct 06 07:24:57 crc kubenswrapper[4845]: I1006 07:24:57.227829 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b" Oct 06 07:24:57 crc kubenswrapper[4845]: E1006 
07:24:57.228838 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:24:57 crc kubenswrapper[4845]: I1006 07:24:57.742377 4845 generic.go:334] "Generic (PLEG): container finished" podID="f02bbc86-1787-4533-ae88-e44fdc4785b4" containerID="12b583665494169d5e4537197b3569ef979b443673b2ac3be380691acb3edb19" exitCode=0 Oct 06 07:24:57 crc kubenswrapper[4845]: I1006 07:24:57.742456 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4h64" event={"ID":"f02bbc86-1787-4533-ae88-e44fdc4785b4","Type":"ContainerDied","Data":"12b583665494169d5e4537197b3569ef979b443673b2ac3be380691acb3edb19"} Oct 06 07:24:58 crc kubenswrapper[4845]: I1006 07:24:58.752417 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4h64" event={"ID":"f02bbc86-1787-4533-ae88-e44fdc4785b4","Type":"ContainerStarted","Data":"994bb11e949ec31807ec351ac5a6b30962f6280ded2eec618b15adbc89aeb8a6"} Oct 06 07:24:58 crc kubenswrapper[4845]: I1006 07:24:58.777881 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j4h64" podStartSLOduration=2.348774652 podStartE2EDuration="3.777819258s" podCreationTimestamp="2025-10-06 07:24:55 +0000 UTC" firstStartedPulling="2025-10-06 07:24:56.733795877 +0000 UTC m=+2381.248536885" lastFinishedPulling="2025-10-06 07:24:58.162840483 +0000 UTC m=+2382.677581491" observedRunningTime="2025-10-06 07:24:58.771838566 +0000 UTC m=+2383.286579584" watchObservedRunningTime="2025-10-06 07:24:58.777819258 +0000 UTC m=+2383.292560266" Oct 06 07:24:59 crc 
kubenswrapper[4845]: I1006 07:24:59.767189 4845 generic.go:334] "Generic (PLEG): container finished" podID="6f89fdcf-abd7-4cf4-aa6f-a05ada603477" containerID="2b8be1c103ad915c05cc018d9f497de2a69d9d9cc228cde529031741368f0cf0" exitCode=0 Oct 06 07:24:59 crc kubenswrapper[4845]: I1006 07:24:59.767341 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" event={"ID":"6f89fdcf-abd7-4cf4-aa6f-a05ada603477","Type":"ContainerDied","Data":"2b8be1c103ad915c05cc018d9f497de2a69d9d9cc228cde529031741368f0cf0"} Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.147047 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.341291 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-cell1-compute-config-0\") pod \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.342074 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-ssh-key\") pod \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.342449 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-extra-config-0\") pod \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.342497 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-cell1-compute-config-1\") pod \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.342550 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skr9t\" (UniqueName: \"kubernetes.io/projected/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-kube-api-access-skr9t\") pod \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.342583 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-combined-ca-bundle\") pod \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.342629 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-migration-ssh-key-1\") pod \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.342672 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-migration-ssh-key-0\") pod \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.342753 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-inventory\") pod 
\"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\" (UID: \"6f89fdcf-abd7-4cf4-aa6f-a05ada603477\") " Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.350198 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "6f89fdcf-abd7-4cf4-aa6f-a05ada603477" (UID: "6f89fdcf-abd7-4cf4-aa6f-a05ada603477"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.364065 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-kube-api-access-skr9t" (OuterVolumeSpecName: "kube-api-access-skr9t") pod "6f89fdcf-abd7-4cf4-aa6f-a05ada603477" (UID: "6f89fdcf-abd7-4cf4-aa6f-a05ada603477"). InnerVolumeSpecName "kube-api-access-skr9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.380068 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "6f89fdcf-abd7-4cf4-aa6f-a05ada603477" (UID: "6f89fdcf-abd7-4cf4-aa6f-a05ada603477"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.382927 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-inventory" (OuterVolumeSpecName: "inventory") pod "6f89fdcf-abd7-4cf4-aa6f-a05ada603477" (UID: "6f89fdcf-abd7-4cf4-aa6f-a05ada603477"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.383619 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "6f89fdcf-abd7-4cf4-aa6f-a05ada603477" (UID: "6f89fdcf-abd7-4cf4-aa6f-a05ada603477"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.386718 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "6f89fdcf-abd7-4cf4-aa6f-a05ada603477" (UID: "6f89fdcf-abd7-4cf4-aa6f-a05ada603477"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.397788 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "6f89fdcf-abd7-4cf4-aa6f-a05ada603477" (UID: "6f89fdcf-abd7-4cf4-aa6f-a05ada603477"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.400157 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "6f89fdcf-abd7-4cf4-aa6f-a05ada603477" (UID: "6f89fdcf-abd7-4cf4-aa6f-a05ada603477"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.417831 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6f89fdcf-abd7-4cf4-aa6f-a05ada603477" (UID: "6f89fdcf-abd7-4cf4-aa6f-a05ada603477"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.446706 4845 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.447041 4845 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.447176 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skr9t\" (UniqueName: \"kubernetes.io/projected/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-kube-api-access-skr9t\") on node \"crc\" DevicePath \"\"" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.447274 4845 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.449610 4845 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.449667 4845 reconciler_common.go:293] "Volume detached 
for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.449678 4845 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.449691 4845 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.449703 4845 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f89fdcf-abd7-4cf4-aa6f-a05ada603477-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.788987 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" event={"ID":"6f89fdcf-abd7-4cf4-aa6f-a05ada603477","Type":"ContainerDied","Data":"1fce448cf3d9a36a54118766c1afdfc133d6f2070e5d37d368d1ab9d9b55ec24"} Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.789026 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fce448cf3d9a36a54118766c1afdfc133d6f2070e5d37d368d1ab9d9b55ec24" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.789674 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gftcx" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.896291 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"] Oct 06 07:25:01 crc kubenswrapper[4845]: E1006 07:25:01.896667 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f89fdcf-abd7-4cf4-aa6f-a05ada603477" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.896678 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f89fdcf-abd7-4cf4-aa6f-a05ada603477" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.896872 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f89fdcf-abd7-4cf4-aa6f-a05ada603477" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.897490 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.900050 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.906222 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-p48vv" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.906317 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.906650 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.907043 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.939673 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"] Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.960091 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.960151 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.960316 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.960456 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.960485 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9" Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.960512 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"
Oct 06 07:25:01 crc kubenswrapper[4845]: I1006 07:25:01.960564 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25p4x\" (UniqueName: \"kubernetes.io/projected/81f752a1-1104-4b71-9a89-1c3961584f6f-kube-api-access-25p4x\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"
Oct 06 07:25:02 crc kubenswrapper[4845]: I1006 07:25:02.062238 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"
Oct 06 07:25:02 crc kubenswrapper[4845]: I1006 07:25:02.062320 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"
Oct 06 07:25:02 crc kubenswrapper[4845]: I1006 07:25:02.062460 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"
Oct 06 07:25:02 crc kubenswrapper[4845]: I1006 07:25:02.062491 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"
Oct 06 07:25:02 crc kubenswrapper[4845]: I1006 07:25:02.062522 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"
Oct 06 07:25:02 crc kubenswrapper[4845]: I1006 07:25:02.062572 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25p4x\" (UniqueName: \"kubernetes.io/projected/81f752a1-1104-4b71-9a89-1c3961584f6f-kube-api-access-25p4x\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"
Oct 06 07:25:02 crc kubenswrapper[4845]: I1006 07:25:02.062631 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"
Oct 06 07:25:02 crc kubenswrapper[4845]: I1006 07:25:02.066764 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"
Oct 06 07:25:02 crc kubenswrapper[4845]: I1006 07:25:02.066798 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"
Oct 06 07:25:02 crc kubenswrapper[4845]: I1006 07:25:02.066946 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"
Oct 06 07:25:02 crc kubenswrapper[4845]: I1006 07:25:02.067544 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"
Oct 06 07:25:02 crc kubenswrapper[4845]: I1006 07:25:02.067649 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"
Oct 06 07:25:02 crc kubenswrapper[4845]: I1006 07:25:02.068134 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"
Oct 06 07:25:02 crc kubenswrapper[4845]: I1006 07:25:02.089177 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25p4x\" (UniqueName: \"kubernetes.io/projected/81f752a1-1104-4b71-9a89-1c3961584f6f-kube-api-access-25p4x\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"
Oct 06 07:25:02 crc kubenswrapper[4845]: I1006 07:25:02.225641 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"
Oct 06 07:25:02 crc kubenswrapper[4845]: I1006 07:25:02.786456 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"]
Oct 06 07:25:02 crc kubenswrapper[4845]: I1006 07:25:02.799664 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9" event={"ID":"81f752a1-1104-4b71-9a89-1c3961584f6f","Type":"ContainerStarted","Data":"5e0efba7c545541ed87a486e57896faa5295a4815c59ef022c053e70b5201878"}
Oct 06 07:25:03 crc kubenswrapper[4845]: I1006 07:25:03.809653 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9" event={"ID":"81f752a1-1104-4b71-9a89-1c3961584f6f","Type":"ContainerStarted","Data":"ce002e4b2959d7d169fb0402fd0cc991cfd21aaff8a087e4c6c875b441898e74"}
Oct 06 07:25:03 crc kubenswrapper[4845]: I1006 07:25:03.830076 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9" podStartSLOduration=2.2723560689999998 podStartE2EDuration="2.83005468s" podCreationTimestamp="2025-10-06 07:25:01 +0000 UTC" firstStartedPulling="2025-10-06 07:25:02.783923787 +0000 UTC m=+2387.298664795" lastFinishedPulling="2025-10-06 07:25:03.341622398 +0000 UTC m=+2387.856363406" observedRunningTime="2025-10-06 07:25:03.823198966 +0000 UTC m=+2388.337939994" watchObservedRunningTime="2025-10-06 07:25:03.83005468 +0000 UTC m=+2388.344795678"
Oct 06 07:25:05 crc kubenswrapper[4845]: I1006 07:25:05.901040 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j4h64"
Oct 06 07:25:05 crc kubenswrapper[4845]: I1006 07:25:05.901153 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j4h64"
Oct 06 07:25:05 crc kubenswrapper[4845]: I1006 07:25:05.990808 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j4h64"
Oct 06 07:25:06 crc kubenswrapper[4845]: I1006 07:25:06.891165 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j4h64"
Oct 06 07:25:06 crc kubenswrapper[4845]: I1006 07:25:06.946303 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4h64"]
Oct 06 07:25:08 crc kubenswrapper[4845]: I1006 07:25:08.857799 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j4h64" podUID="f02bbc86-1787-4533-ae88-e44fdc4785b4" containerName="registry-server" containerID="cri-o://994bb11e949ec31807ec351ac5a6b30962f6280ded2eec618b15adbc89aeb8a6" gracePeriod=2
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.319703 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4h64"
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.508757 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f02bbc86-1787-4533-ae88-e44fdc4785b4-catalog-content\") pod \"f02bbc86-1787-4533-ae88-e44fdc4785b4\" (UID: \"f02bbc86-1787-4533-ae88-e44fdc4785b4\") "
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.509063 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f02bbc86-1787-4533-ae88-e44fdc4785b4-utilities\") pod \"f02bbc86-1787-4533-ae88-e44fdc4785b4\" (UID: \"f02bbc86-1787-4533-ae88-e44fdc4785b4\") "
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.509116 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2jsl\" (UniqueName: \"kubernetes.io/projected/f02bbc86-1787-4533-ae88-e44fdc4785b4-kube-api-access-w2jsl\") pod \"f02bbc86-1787-4533-ae88-e44fdc4785b4\" (UID: \"f02bbc86-1787-4533-ae88-e44fdc4785b4\") "
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.509720 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f02bbc86-1787-4533-ae88-e44fdc4785b4-utilities" (OuterVolumeSpecName: "utilities") pod "f02bbc86-1787-4533-ae88-e44fdc4785b4" (UID: "f02bbc86-1787-4533-ae88-e44fdc4785b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.521855 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f02bbc86-1787-4533-ae88-e44fdc4785b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f02bbc86-1787-4533-ae88-e44fdc4785b4" (UID: "f02bbc86-1787-4533-ae88-e44fdc4785b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.522122 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f02bbc86-1787-4533-ae88-e44fdc4785b4-kube-api-access-w2jsl" (OuterVolumeSpecName: "kube-api-access-w2jsl") pod "f02bbc86-1787-4533-ae88-e44fdc4785b4" (UID: "f02bbc86-1787-4533-ae88-e44fdc4785b4"). InnerVolumeSpecName "kube-api-access-w2jsl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.611649 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f02bbc86-1787-4533-ae88-e44fdc4785b4-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.611679 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2jsl\" (UniqueName: \"kubernetes.io/projected/f02bbc86-1787-4533-ae88-e44fdc4785b4-kube-api-access-w2jsl\") on node \"crc\" DevicePath \"\""
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.611691 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f02bbc86-1787-4533-ae88-e44fdc4785b4-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.867179 4845 generic.go:334] "Generic (PLEG): container finished" podID="f02bbc86-1787-4533-ae88-e44fdc4785b4" containerID="994bb11e949ec31807ec351ac5a6b30962f6280ded2eec618b15adbc89aeb8a6" exitCode=0
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.867232 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4h64" event={"ID":"f02bbc86-1787-4533-ae88-e44fdc4785b4","Type":"ContainerDied","Data":"994bb11e949ec31807ec351ac5a6b30962f6280ded2eec618b15adbc89aeb8a6"}
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.867266 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4h64"
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.868777 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4h64" event={"ID":"f02bbc86-1787-4533-ae88-e44fdc4785b4","Type":"ContainerDied","Data":"44c0a504250c8d0178083c4673acedba2edf1959cee0e4864b60c65aa2b9a078"}
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.869196 4845 scope.go:117] "RemoveContainer" containerID="994bb11e949ec31807ec351ac5a6b30962f6280ded2eec618b15adbc89aeb8a6"
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.903450 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4h64"]
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.904631 4845 scope.go:117] "RemoveContainer" containerID="12b583665494169d5e4537197b3569ef979b443673b2ac3be380691acb3edb19"
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.911430 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4h64"]
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.924745 4845 scope.go:117] "RemoveContainer" containerID="540d4ff2e2f85b780b214c3cc616c7496ea86c2668f92c175c925e82e34834d2"
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.972728 4845 scope.go:117] "RemoveContainer" containerID="994bb11e949ec31807ec351ac5a6b30962f6280ded2eec618b15adbc89aeb8a6"
Oct 06 07:25:09 crc kubenswrapper[4845]: E1006 07:25:09.973167 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"994bb11e949ec31807ec351ac5a6b30962f6280ded2eec618b15adbc89aeb8a6\": container with ID starting with 994bb11e949ec31807ec351ac5a6b30962f6280ded2eec618b15adbc89aeb8a6 not found: ID does not exist" containerID="994bb11e949ec31807ec351ac5a6b30962f6280ded2eec618b15adbc89aeb8a6"
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.973196 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"994bb11e949ec31807ec351ac5a6b30962f6280ded2eec618b15adbc89aeb8a6"} err="failed to get container status \"994bb11e949ec31807ec351ac5a6b30962f6280ded2eec618b15adbc89aeb8a6\": rpc error: code = NotFound desc = could not find container \"994bb11e949ec31807ec351ac5a6b30962f6280ded2eec618b15adbc89aeb8a6\": container with ID starting with 994bb11e949ec31807ec351ac5a6b30962f6280ded2eec618b15adbc89aeb8a6 not found: ID does not exist"
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.973218 4845 scope.go:117] "RemoveContainer" containerID="12b583665494169d5e4537197b3569ef979b443673b2ac3be380691acb3edb19"
Oct 06 07:25:09 crc kubenswrapper[4845]: E1006 07:25:09.973607 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12b583665494169d5e4537197b3569ef979b443673b2ac3be380691acb3edb19\": container with ID starting with 12b583665494169d5e4537197b3569ef979b443673b2ac3be380691acb3edb19 not found: ID does not exist" containerID="12b583665494169d5e4537197b3569ef979b443673b2ac3be380691acb3edb19"
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.973633 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12b583665494169d5e4537197b3569ef979b443673b2ac3be380691acb3edb19"} err="failed to get container status \"12b583665494169d5e4537197b3569ef979b443673b2ac3be380691acb3edb19\": rpc error: code = NotFound desc = could not find container \"12b583665494169d5e4537197b3569ef979b443673b2ac3be380691acb3edb19\": container with ID starting with 12b583665494169d5e4537197b3569ef979b443673b2ac3be380691acb3edb19 not found: ID does not exist"
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.973649 4845 scope.go:117] "RemoveContainer" containerID="540d4ff2e2f85b780b214c3cc616c7496ea86c2668f92c175c925e82e34834d2"
Oct 06 07:25:09 crc kubenswrapper[4845]: E1006 07:25:09.974023 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"540d4ff2e2f85b780b214c3cc616c7496ea86c2668f92c175c925e82e34834d2\": container with ID starting with 540d4ff2e2f85b780b214c3cc616c7496ea86c2668f92c175c925e82e34834d2 not found: ID does not exist" containerID="540d4ff2e2f85b780b214c3cc616c7496ea86c2668f92c175c925e82e34834d2"
Oct 06 07:25:09 crc kubenswrapper[4845]: I1006 07:25:09.974072 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540d4ff2e2f85b780b214c3cc616c7496ea86c2668f92c175c925e82e34834d2"} err="failed to get container status \"540d4ff2e2f85b780b214c3cc616c7496ea86c2668f92c175c925e82e34834d2\": rpc error: code = NotFound desc = could not find container \"540d4ff2e2f85b780b214c3cc616c7496ea86c2668f92c175c925e82e34834d2\": container with ID starting with 540d4ff2e2f85b780b214c3cc616c7496ea86c2668f92c175c925e82e34834d2 not found: ID does not exist"
Oct 06 07:25:10 crc kubenswrapper[4845]: I1006 07:25:10.236129 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f02bbc86-1787-4533-ae88-e44fdc4785b4" path="/var/lib/kubelet/pods/f02bbc86-1787-4533-ae88-e44fdc4785b4/volumes"
Oct 06 07:25:12 crc kubenswrapper[4845]: I1006 07:25:12.226901 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b"
Oct 06 07:25:12 crc kubenswrapper[4845]: E1006 07:25:12.228734 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:25:26 crc kubenswrapper[4845]: I1006 07:25:26.237171 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b"
Oct 06 07:25:26 crc kubenswrapper[4845]: E1006 07:25:26.238248 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:25:38 crc kubenswrapper[4845]: I1006 07:25:38.228348 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b"
Oct 06 07:25:38 crc kubenswrapper[4845]: E1006 07:25:38.230111 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:25:49 crc kubenswrapper[4845]: I1006 07:25:49.227287 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b"
Oct 06 07:25:49 crc kubenswrapper[4845]: E1006 07:25:49.227932 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:26:00 crc kubenswrapper[4845]: I1006 07:26:00.227495 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b"
Oct 06 07:26:00 crc kubenswrapper[4845]: E1006 07:26:00.228225 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:26:12 crc kubenswrapper[4845]: I1006 07:26:12.227155 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b"
Oct 06 07:26:12 crc kubenswrapper[4845]: E1006 07:26:12.229165 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:26:24 crc kubenswrapper[4845]: I1006 07:26:24.226940 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b"
Oct 06 07:26:24 crc kubenswrapper[4845]: E1006 07:26:24.227992 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:26:38 crc kubenswrapper[4845]: I1006 07:26:38.227108 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b"
Oct 06 07:26:38 crc kubenswrapper[4845]: E1006 07:26:38.227997 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:26:50 crc kubenswrapper[4845]: I1006 07:26:50.228207 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b"
Oct 06 07:26:50 crc kubenswrapper[4845]: E1006 07:26:50.229691 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:27:04 crc kubenswrapper[4845]: I1006 07:27:04.227483 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b"
Oct 06 07:27:04 crc kubenswrapper[4845]: I1006 07:27:04.933235 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerStarted","Data":"b4fc779f56264c9c35988a3aa09616764c003f7e3ff2f7b4f4f6180c8cf37a25"}
Oct 06 07:27:14 crc kubenswrapper[4845]: I1006 07:27:14.018004 4845 generic.go:334] "Generic (PLEG): container finished" podID="81f752a1-1104-4b71-9a89-1c3961584f6f" containerID="ce002e4b2959d7d169fb0402fd0cc991cfd21aaff8a087e4c6c875b441898e74" exitCode=0
Oct 06 07:27:14 crc kubenswrapper[4845]: I1006 07:27:14.018060 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9" event={"ID":"81f752a1-1104-4b71-9a89-1c3961584f6f","Type":"ContainerDied","Data":"ce002e4b2959d7d169fb0402fd0cc991cfd21aaff8a087e4c6c875b441898e74"}
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.505503 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.645084 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ssh-key\") pod \"81f752a1-1104-4b71-9a89-1c3961584f6f\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") "
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.645696 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-telemetry-combined-ca-bundle\") pod \"81f752a1-1104-4b71-9a89-1c3961584f6f\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") "
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.645780 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ceilometer-compute-config-data-0\") pod \"81f752a1-1104-4b71-9a89-1c3961584f6f\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") "
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.645921 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25p4x\" (UniqueName: \"kubernetes.io/projected/81f752a1-1104-4b71-9a89-1c3961584f6f-kube-api-access-25p4x\") pod \"81f752a1-1104-4b71-9a89-1c3961584f6f\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") "
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.646006 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ceilometer-compute-config-data-1\") pod \"81f752a1-1104-4b71-9a89-1c3961584f6f\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") "
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.646111 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ceilometer-compute-config-data-2\") pod \"81f752a1-1104-4b71-9a89-1c3961584f6f\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") "
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.646191 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-inventory\") pod \"81f752a1-1104-4b71-9a89-1c3961584f6f\" (UID: \"81f752a1-1104-4b71-9a89-1c3961584f6f\") "
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.650976 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "81f752a1-1104-4b71-9a89-1c3961584f6f" (UID: "81f752a1-1104-4b71-9a89-1c3961584f6f"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.660192 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81f752a1-1104-4b71-9a89-1c3961584f6f-kube-api-access-25p4x" (OuterVolumeSpecName: "kube-api-access-25p4x") pod "81f752a1-1104-4b71-9a89-1c3961584f6f" (UID: "81f752a1-1104-4b71-9a89-1c3961584f6f"). InnerVolumeSpecName "kube-api-access-25p4x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.674677 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "81f752a1-1104-4b71-9a89-1c3961584f6f" (UID: "81f752a1-1104-4b71-9a89-1c3961584f6f"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.683242 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "81f752a1-1104-4b71-9a89-1c3961584f6f" (UID: "81f752a1-1104-4b71-9a89-1c3961584f6f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.683256 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "81f752a1-1104-4b71-9a89-1c3961584f6f" (UID: "81f752a1-1104-4b71-9a89-1c3961584f6f"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.683793 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "81f752a1-1104-4b71-9a89-1c3961584f6f" (UID: "81f752a1-1104-4b71-9a89-1c3961584f6f"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.693626 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-inventory" (OuterVolumeSpecName: "inventory") pod "81f752a1-1104-4b71-9a89-1c3961584f6f" (UID: "81f752a1-1104-4b71-9a89-1c3961584f6f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.748629 4845 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.748663 4845 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-inventory\") on node \"crc\" DevicePath \"\""
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.748674 4845 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.748684 4845 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.748695 4845 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.748705 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25p4x\" (UniqueName: \"kubernetes.io/projected/81f752a1-1104-4b71-9a89-1c3961584f6f-kube-api-access-25p4x\") on node \"crc\" DevicePath \"\""
Oct 06 07:27:15 crc kubenswrapper[4845]: I1006 07:27:15.748715 4845 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/81f752a1-1104-4b71-9a89-1c3961584f6f-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Oct 06 07:27:16 crc kubenswrapper[4845]: I1006 07:27:16.041619 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9" event={"ID":"81f752a1-1104-4b71-9a89-1c3961584f6f","Type":"ContainerDied","Data":"5e0efba7c545541ed87a486e57896faa5295a4815c59ef022c053e70b5201878"}
Oct 06 07:27:16 crc kubenswrapper[4845]: I1006 07:27:16.041670 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e0efba7c545541ed87a486e57896faa5295a4815c59ef022c053e70b5201878"
Oct 06 07:27:16 crc kubenswrapper[4845]: I1006 07:27:16.041787 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9"
Oct 06 07:27:29 crc kubenswrapper[4845]: I1006 07:27:29.820387 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qxrxl"]
Oct 06 07:27:29 crc kubenswrapper[4845]: E1006 07:27:29.821397 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f02bbc86-1787-4533-ae88-e44fdc4785b4" containerName="registry-server"
Oct 06 07:27:29 crc kubenswrapper[4845]: I1006 07:27:29.821414 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f02bbc86-1787-4533-ae88-e44fdc4785b4" containerName="registry-server"
Oct 06 07:27:29 crc kubenswrapper[4845]: E1006 07:27:29.821780 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f02bbc86-1787-4533-ae88-e44fdc4785b4" containerName="extract-content"
Oct 06 07:27:29 crc kubenswrapper[4845]: I1006 07:27:29.821794 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f02bbc86-1787-4533-ae88-e44fdc4785b4" containerName="extract-content"
Oct 06 07:27:29 crc kubenswrapper[4845]: E1006 07:27:29.821806 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f752a1-1104-4b71-9a89-1c3961584f6f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Oct 06 07:27:29 crc kubenswrapper[4845]: I1006 07:27:29.821815 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f752a1-1104-4b71-9a89-1c3961584f6f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Oct 06 07:27:29 crc kubenswrapper[4845]: E1006 07:27:29.821850 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f02bbc86-1787-4533-ae88-e44fdc4785b4" containerName="extract-utilities"
Oct 06 07:27:29 crc kubenswrapper[4845]: I1006 07:27:29.821858 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f02bbc86-1787-4533-ae88-e44fdc4785b4" containerName="extract-utilities"
Oct 06 07:27:29 crc kubenswrapper[4845]: I1006 07:27:29.822111 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="f02bbc86-1787-4533-ae88-e44fdc4785b4" containerName="registry-server"
Oct 06 07:27:29 crc kubenswrapper[4845]: I1006 07:27:29.822134 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="81f752a1-1104-4b71-9a89-1c3961584f6f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Oct 06 07:27:29 crc kubenswrapper[4845]: I1006 07:27:29.823606 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxrxl"
Oct 06 07:27:29 crc kubenswrapper[4845]: I1006 07:27:29.833231 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qxrxl"]
Oct 06 07:27:29 crc kubenswrapper[4845]: I1006 07:27:29.899965 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h48sb\" (UniqueName: \"kubernetes.io/projected/c2b7ddac-e0ba-4f8f-8126-5f513c851fab-kube-api-access-h48sb\") pod \"redhat-operators-qxrxl\" (UID: \"c2b7ddac-e0ba-4f8f-8126-5f513c851fab\") " pod="openshift-marketplace/redhat-operators-qxrxl"
Oct 06 07:27:29 crc kubenswrapper[4845]: I1006 07:27:29.900343 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2b7ddac-e0ba-4f8f-8126-5f513c851fab-utilities\") pod \"redhat-operators-qxrxl\" (UID: \"c2b7ddac-e0ba-4f8f-8126-5f513c851fab\") " pod="openshift-marketplace/redhat-operators-qxrxl"
Oct 06 07:27:29 crc kubenswrapper[4845]: I1006 07:27:29.900388 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2b7ddac-e0ba-4f8f-8126-5f513c851fab-catalog-content\") pod \"redhat-operators-qxrxl\" (UID: \"c2b7ddac-e0ba-4f8f-8126-5f513c851fab\") " pod="openshift-marketplace/redhat-operators-qxrxl"
Oct 06 07:27:30 crc kubenswrapper[4845]:
I1006 07:27:30.002508 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h48sb\" (UniqueName: \"kubernetes.io/projected/c2b7ddac-e0ba-4f8f-8126-5f513c851fab-kube-api-access-h48sb\") pod \"redhat-operators-qxrxl\" (UID: \"c2b7ddac-e0ba-4f8f-8126-5f513c851fab\") " pod="openshift-marketplace/redhat-operators-qxrxl" Oct 06 07:27:30 crc kubenswrapper[4845]: I1006 07:27:30.002592 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2b7ddac-e0ba-4f8f-8126-5f513c851fab-utilities\") pod \"redhat-operators-qxrxl\" (UID: \"c2b7ddac-e0ba-4f8f-8126-5f513c851fab\") " pod="openshift-marketplace/redhat-operators-qxrxl" Oct 06 07:27:30 crc kubenswrapper[4845]: I1006 07:27:30.002619 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2b7ddac-e0ba-4f8f-8126-5f513c851fab-catalog-content\") pod \"redhat-operators-qxrxl\" (UID: \"c2b7ddac-e0ba-4f8f-8126-5f513c851fab\") " pod="openshift-marketplace/redhat-operators-qxrxl" Oct 06 07:27:30 crc kubenswrapper[4845]: I1006 07:27:30.003111 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2b7ddac-e0ba-4f8f-8126-5f513c851fab-utilities\") pod \"redhat-operators-qxrxl\" (UID: \"c2b7ddac-e0ba-4f8f-8126-5f513c851fab\") " pod="openshift-marketplace/redhat-operators-qxrxl" Oct 06 07:27:30 crc kubenswrapper[4845]: I1006 07:27:30.003151 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2b7ddac-e0ba-4f8f-8126-5f513c851fab-catalog-content\") pod \"redhat-operators-qxrxl\" (UID: \"c2b7ddac-e0ba-4f8f-8126-5f513c851fab\") " pod="openshift-marketplace/redhat-operators-qxrxl" Oct 06 07:27:30 crc kubenswrapper[4845]: I1006 07:27:30.019686 4845 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-h48sb\" (UniqueName: \"kubernetes.io/projected/c2b7ddac-e0ba-4f8f-8126-5f513c851fab-kube-api-access-h48sb\") pod \"redhat-operators-qxrxl\" (UID: \"c2b7ddac-e0ba-4f8f-8126-5f513c851fab\") " pod="openshift-marketplace/redhat-operators-qxrxl" Oct 06 07:27:30 crc kubenswrapper[4845]: I1006 07:27:30.145730 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxrxl" Oct 06 07:27:30 crc kubenswrapper[4845]: I1006 07:27:30.648633 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qxrxl"] Oct 06 07:27:31 crc kubenswrapper[4845]: I1006 07:27:31.164911 4845 generic.go:334] "Generic (PLEG): container finished" podID="c2b7ddac-e0ba-4f8f-8126-5f513c851fab" containerID="87ae6747670084547844e21b8e97a74fd2c150f4e812ed9cce68e34647068064" exitCode=0 Oct 06 07:27:31 crc kubenswrapper[4845]: I1006 07:27:31.164968 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxrxl" event={"ID":"c2b7ddac-e0ba-4f8f-8126-5f513c851fab","Type":"ContainerDied","Data":"87ae6747670084547844e21b8e97a74fd2c150f4e812ed9cce68e34647068064"} Oct 06 07:27:31 crc kubenswrapper[4845]: I1006 07:27:31.165269 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxrxl" event={"ID":"c2b7ddac-e0ba-4f8f-8126-5f513c851fab","Type":"ContainerStarted","Data":"7011f9bee5458107f911cedb0001a06ef4ddd46830692ef2f416c8d9cf383ad5"} Oct 06 07:27:31 crc kubenswrapper[4845]: I1006 07:27:31.167014 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 07:27:32 crc kubenswrapper[4845]: I1006 07:27:32.199688 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxrxl" 
event={"ID":"c2b7ddac-e0ba-4f8f-8126-5f513c851fab","Type":"ContainerStarted","Data":"f96c60ed593e49bec1922f510720d58abeb495b3c73ae181fa1443793e824e5f"} Oct 06 07:27:33 crc kubenswrapper[4845]: I1006 07:27:33.209306 4845 generic.go:334] "Generic (PLEG): container finished" podID="c2b7ddac-e0ba-4f8f-8126-5f513c851fab" containerID="f96c60ed593e49bec1922f510720d58abeb495b3c73ae181fa1443793e824e5f" exitCode=0 Oct 06 07:27:33 crc kubenswrapper[4845]: I1006 07:27:33.209419 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxrxl" event={"ID":"c2b7ddac-e0ba-4f8f-8126-5f513c851fab","Type":"ContainerDied","Data":"f96c60ed593e49bec1922f510720d58abeb495b3c73ae181fa1443793e824e5f"} Oct 06 07:27:34 crc kubenswrapper[4845]: I1006 07:27:34.220633 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxrxl" event={"ID":"c2b7ddac-e0ba-4f8f-8126-5f513c851fab","Type":"ContainerStarted","Data":"8ebeb19eb10dedfd5557fb30d9adfa5c11ff7c737fa2c34ee2dce80ede3d9f1d"} Oct 06 07:27:34 crc kubenswrapper[4845]: I1006 07:27:34.241128 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qxrxl" podStartSLOduration=2.783395065 podStartE2EDuration="5.241113123s" podCreationTimestamp="2025-10-06 07:27:29 +0000 UTC" firstStartedPulling="2025-10-06 07:27:31.166745362 +0000 UTC m=+2535.681486370" lastFinishedPulling="2025-10-06 07:27:33.62446342 +0000 UTC m=+2538.139204428" observedRunningTime="2025-10-06 07:27:34.2382037 +0000 UTC m=+2538.752944708" watchObservedRunningTime="2025-10-06 07:27:34.241113123 +0000 UTC m=+2538.755854131" Oct 06 07:27:40 crc kubenswrapper[4845]: I1006 07:27:40.147460 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qxrxl" Oct 06 07:27:40 crc kubenswrapper[4845]: I1006 07:27:40.148359 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-qxrxl" Oct 06 07:27:40 crc kubenswrapper[4845]: I1006 07:27:40.216195 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qxrxl" Oct 06 07:27:40 crc kubenswrapper[4845]: I1006 07:27:40.367488 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qxrxl" Oct 06 07:27:40 crc kubenswrapper[4845]: I1006 07:27:40.461635 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qxrxl"] Oct 06 07:27:42 crc kubenswrapper[4845]: I1006 07:27:42.334705 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qxrxl" podUID="c2b7ddac-e0ba-4f8f-8126-5f513c851fab" containerName="registry-server" containerID="cri-o://8ebeb19eb10dedfd5557fb30d9adfa5c11ff7c737fa2c34ee2dce80ede3d9f1d" gracePeriod=2 Oct 06 07:27:42 crc kubenswrapper[4845]: I1006 07:27:42.858215 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qxrxl" Oct 06 07:27:42 crc kubenswrapper[4845]: I1006 07:27:42.968274 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2b7ddac-e0ba-4f8f-8126-5f513c851fab-utilities\") pod \"c2b7ddac-e0ba-4f8f-8126-5f513c851fab\" (UID: \"c2b7ddac-e0ba-4f8f-8126-5f513c851fab\") " Oct 06 07:27:42 crc kubenswrapper[4845]: I1006 07:27:42.968382 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2b7ddac-e0ba-4f8f-8126-5f513c851fab-catalog-content\") pod \"c2b7ddac-e0ba-4f8f-8126-5f513c851fab\" (UID: \"c2b7ddac-e0ba-4f8f-8126-5f513c851fab\") " Oct 06 07:27:42 crc kubenswrapper[4845]: I1006 07:27:42.968539 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h48sb\" (UniqueName: \"kubernetes.io/projected/c2b7ddac-e0ba-4f8f-8126-5f513c851fab-kube-api-access-h48sb\") pod \"c2b7ddac-e0ba-4f8f-8126-5f513c851fab\" (UID: \"c2b7ddac-e0ba-4f8f-8126-5f513c851fab\") " Oct 06 07:27:42 crc kubenswrapper[4845]: I1006 07:27:42.970338 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2b7ddac-e0ba-4f8f-8126-5f513c851fab-utilities" (OuterVolumeSpecName: "utilities") pod "c2b7ddac-e0ba-4f8f-8126-5f513c851fab" (UID: "c2b7ddac-e0ba-4f8f-8126-5f513c851fab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:27:42 crc kubenswrapper[4845]: I1006 07:27:42.975263 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2b7ddac-e0ba-4f8f-8126-5f513c851fab-kube-api-access-h48sb" (OuterVolumeSpecName: "kube-api-access-h48sb") pod "c2b7ddac-e0ba-4f8f-8126-5f513c851fab" (UID: "c2b7ddac-e0ba-4f8f-8126-5f513c851fab"). InnerVolumeSpecName "kube-api-access-h48sb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:27:43 crc kubenswrapper[4845]: I1006 07:27:43.059089 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2b7ddac-e0ba-4f8f-8126-5f513c851fab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2b7ddac-e0ba-4f8f-8126-5f513c851fab" (UID: "c2b7ddac-e0ba-4f8f-8126-5f513c851fab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:27:43 crc kubenswrapper[4845]: I1006 07:27:43.070400 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2b7ddac-e0ba-4f8f-8126-5f513c851fab-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 07:27:43 crc kubenswrapper[4845]: I1006 07:27:43.070437 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h48sb\" (UniqueName: \"kubernetes.io/projected/c2b7ddac-e0ba-4f8f-8126-5f513c851fab-kube-api-access-h48sb\") on node \"crc\" DevicePath \"\"" Oct 06 07:27:43 crc kubenswrapper[4845]: I1006 07:27:43.070449 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2b7ddac-e0ba-4f8f-8126-5f513c851fab-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 07:27:43 crc kubenswrapper[4845]: I1006 07:27:43.347956 4845 generic.go:334] "Generic (PLEG): container finished" podID="c2b7ddac-e0ba-4f8f-8126-5f513c851fab" containerID="8ebeb19eb10dedfd5557fb30d9adfa5c11ff7c737fa2c34ee2dce80ede3d9f1d" exitCode=0 Oct 06 07:27:43 crc kubenswrapper[4845]: I1006 07:27:43.348002 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxrxl" event={"ID":"c2b7ddac-e0ba-4f8f-8126-5f513c851fab","Type":"ContainerDied","Data":"8ebeb19eb10dedfd5557fb30d9adfa5c11ff7c737fa2c34ee2dce80ede3d9f1d"} Oct 06 07:27:43 crc kubenswrapper[4845]: I1006 07:27:43.348029 4845 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-qxrxl" event={"ID":"c2b7ddac-e0ba-4f8f-8126-5f513c851fab","Type":"ContainerDied","Data":"7011f9bee5458107f911cedb0001a06ef4ddd46830692ef2f416c8d9cf383ad5"} Oct 06 07:27:43 crc kubenswrapper[4845]: I1006 07:27:43.348029 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxrxl" Oct 06 07:27:43 crc kubenswrapper[4845]: I1006 07:27:43.348049 4845 scope.go:117] "RemoveContainer" containerID="8ebeb19eb10dedfd5557fb30d9adfa5c11ff7c737fa2c34ee2dce80ede3d9f1d" Oct 06 07:27:43 crc kubenswrapper[4845]: I1006 07:27:43.386542 4845 scope.go:117] "RemoveContainer" containerID="f96c60ed593e49bec1922f510720d58abeb495b3c73ae181fa1443793e824e5f" Oct 06 07:27:43 crc kubenswrapper[4845]: I1006 07:27:43.395747 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qxrxl"] Oct 06 07:27:43 crc kubenswrapper[4845]: I1006 07:27:43.413617 4845 scope.go:117] "RemoveContainer" containerID="87ae6747670084547844e21b8e97a74fd2c150f4e812ed9cce68e34647068064" Oct 06 07:27:43 crc kubenswrapper[4845]: I1006 07:27:43.422517 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qxrxl"] Oct 06 07:27:43 crc kubenswrapper[4845]: I1006 07:27:43.457561 4845 scope.go:117] "RemoveContainer" containerID="8ebeb19eb10dedfd5557fb30d9adfa5c11ff7c737fa2c34ee2dce80ede3d9f1d" Oct 06 07:27:43 crc kubenswrapper[4845]: E1006 07:27:43.457985 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ebeb19eb10dedfd5557fb30d9adfa5c11ff7c737fa2c34ee2dce80ede3d9f1d\": container with ID starting with 8ebeb19eb10dedfd5557fb30d9adfa5c11ff7c737fa2c34ee2dce80ede3d9f1d not found: ID does not exist" containerID="8ebeb19eb10dedfd5557fb30d9adfa5c11ff7c737fa2c34ee2dce80ede3d9f1d" Oct 06 07:27:43 crc kubenswrapper[4845]: I1006 07:27:43.458042 4845 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ebeb19eb10dedfd5557fb30d9adfa5c11ff7c737fa2c34ee2dce80ede3d9f1d"} err="failed to get container status \"8ebeb19eb10dedfd5557fb30d9adfa5c11ff7c737fa2c34ee2dce80ede3d9f1d\": rpc error: code = NotFound desc = could not find container \"8ebeb19eb10dedfd5557fb30d9adfa5c11ff7c737fa2c34ee2dce80ede3d9f1d\": container with ID starting with 8ebeb19eb10dedfd5557fb30d9adfa5c11ff7c737fa2c34ee2dce80ede3d9f1d not found: ID does not exist" Oct 06 07:27:43 crc kubenswrapper[4845]: I1006 07:27:43.458070 4845 scope.go:117] "RemoveContainer" containerID="f96c60ed593e49bec1922f510720d58abeb495b3c73ae181fa1443793e824e5f" Oct 06 07:27:43 crc kubenswrapper[4845]: E1006 07:27:43.458393 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f96c60ed593e49bec1922f510720d58abeb495b3c73ae181fa1443793e824e5f\": container with ID starting with f96c60ed593e49bec1922f510720d58abeb495b3c73ae181fa1443793e824e5f not found: ID does not exist" containerID="f96c60ed593e49bec1922f510720d58abeb495b3c73ae181fa1443793e824e5f" Oct 06 07:27:43 crc kubenswrapper[4845]: I1006 07:27:43.458426 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f96c60ed593e49bec1922f510720d58abeb495b3c73ae181fa1443793e824e5f"} err="failed to get container status \"f96c60ed593e49bec1922f510720d58abeb495b3c73ae181fa1443793e824e5f\": rpc error: code = NotFound desc = could not find container \"f96c60ed593e49bec1922f510720d58abeb495b3c73ae181fa1443793e824e5f\": container with ID starting with f96c60ed593e49bec1922f510720d58abeb495b3c73ae181fa1443793e824e5f not found: ID does not exist" Oct 06 07:27:43 crc kubenswrapper[4845]: I1006 07:27:43.458444 4845 scope.go:117] "RemoveContainer" containerID="87ae6747670084547844e21b8e97a74fd2c150f4e812ed9cce68e34647068064" Oct 06 07:27:43 crc kubenswrapper[4845]: E1006 
07:27:43.458700 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87ae6747670084547844e21b8e97a74fd2c150f4e812ed9cce68e34647068064\": container with ID starting with 87ae6747670084547844e21b8e97a74fd2c150f4e812ed9cce68e34647068064 not found: ID does not exist" containerID="87ae6747670084547844e21b8e97a74fd2c150f4e812ed9cce68e34647068064" Oct 06 07:27:43 crc kubenswrapper[4845]: I1006 07:27:43.458791 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ae6747670084547844e21b8e97a74fd2c150f4e812ed9cce68e34647068064"} err="failed to get container status \"87ae6747670084547844e21b8e97a74fd2c150f4e812ed9cce68e34647068064\": rpc error: code = NotFound desc = could not find container \"87ae6747670084547844e21b8e97a74fd2c150f4e812ed9cce68e34647068064\": container with ID starting with 87ae6747670084547844e21b8e97a74fd2c150f4e812ed9cce68e34647068064 not found: ID does not exist" Oct 06 07:27:44 crc kubenswrapper[4845]: I1006 07:27:44.241423 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2b7ddac-e0ba-4f8f-8126-5f513c851fab" path="/var/lib/kubelet/pods/c2b7ddac-e0ba-4f8f-8126-5f513c851fab/volumes" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.754554 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 06 07:28:00 crc kubenswrapper[4845]: E1006 07:28:00.755520 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b7ddac-e0ba-4f8f-8126-5f513c851fab" containerName="registry-server" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.755538 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b7ddac-e0ba-4f8f-8126-5f513c851fab" containerName="registry-server" Oct 06 07:28:00 crc kubenswrapper[4845]: E1006 07:28:00.755557 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b7ddac-e0ba-4f8f-8126-5f513c851fab" 
containerName="extract-content" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.755565 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b7ddac-e0ba-4f8f-8126-5f513c851fab" containerName="extract-content" Oct 06 07:28:00 crc kubenswrapper[4845]: E1006 07:28:00.755586 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b7ddac-e0ba-4f8f-8126-5f513c851fab" containerName="extract-utilities" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.755595 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b7ddac-e0ba-4f8f-8126-5f513c851fab" containerName="extract-utilities" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.755761 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2b7ddac-e0ba-4f8f-8126-5f513c851fab" containerName="registry-server" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.756595 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.763130 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.763454 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jqvnt" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.763581 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.763583 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.766656 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.891198 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkpd5\" (UniqueName: \"kubernetes.io/projected/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-kube-api-access-gkpd5\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.891273 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.891349 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-config-data\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.891400 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.891469 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.891532 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.891562 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.891598 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.891630 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.993313 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.993408 4845 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.993442 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.993482 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.993512 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.993871 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.994217 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.994495 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkpd5\" (UniqueName: \"kubernetes.io/projected/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-kube-api-access-gkpd5\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.994639 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.994764 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.994853 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-config-data\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.995018 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.994577 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:00 crc kubenswrapper[4845]: I1006 07:28:00.996594 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-config-data\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:01 crc kubenswrapper[4845]: I1006 07:28:01.001076 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:01 crc kubenswrapper[4845]: I1006 07:28:01.001697 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:01 crc kubenswrapper[4845]: I1006 07:28:01.004171 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " 
pod="openstack/tempest-tests-tempest" Oct 06 07:28:01 crc kubenswrapper[4845]: I1006 07:28:01.011882 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkpd5\" (UniqueName: \"kubernetes.io/projected/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-kube-api-access-gkpd5\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:01 crc kubenswrapper[4845]: I1006 07:28:01.025679 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " pod="openstack/tempest-tests-tempest" Oct 06 07:28:01 crc kubenswrapper[4845]: I1006 07:28:01.075299 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 06 07:28:01 crc kubenswrapper[4845]: I1006 07:28:01.492443 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 06 07:28:02 crc kubenswrapper[4845]: I1006 07:28:02.516158 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"1fdf5d3a-7d9e-4702-ae15-1373bbd94574","Type":"ContainerStarted","Data":"b612fc75e917be070d32dd6a223c0d683826a0bf36c34d1607e36074b2458206"} Oct 06 07:28:41 crc kubenswrapper[4845]: E1006 07:28:41.186562 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:b78cfc68a577b1553523c8a70a34e297" Oct 06 07:28:41 crc kubenswrapper[4845]: E1006 07:28:41.187157 4845 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:b78cfc68a577b1553523c8a70a34e297" Oct 06 07:28:41 crc kubenswrapper[4845]: E1006 07:28:41.187318 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:b78cfc68a577b1553523c8a70a34e297,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,Mou
ntPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gkpd5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(1fdf5d3a-7d9e-4702-ae15-1373bbd94574): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 07:28:41 crc kubenswrapper[4845]: E1006 07:28:41.188526 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="1fdf5d3a-7d9e-4702-ae15-1373bbd94574" Oct 06 07:28:41 crc kubenswrapper[4845]: E1006 07:28:41.909699 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:b78cfc68a577b1553523c8a70a34e297\\\"\"" pod="openstack/tempest-tests-tempest" podUID="1fdf5d3a-7d9e-4702-ae15-1373bbd94574" Oct 06 07:28:53 crc kubenswrapper[4845]: I1006 07:28:53.400324 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 06 07:28:55 crc kubenswrapper[4845]: I1006 07:28:55.018891 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"1fdf5d3a-7d9e-4702-ae15-1373bbd94574","Type":"ContainerStarted","Data":"9772af0f187f9d7bcf36aac88902b0557fdf588742f04828b95ad85e4383d94e"} Oct 06 07:28:55 crc kubenswrapper[4845]: I1006 07:28:55.069097 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.176305117 podStartE2EDuration="56.069074433s" podCreationTimestamp="2025-10-06 07:27:59 +0000 UTC" firstStartedPulling="2025-10-06 07:28:01.505084645 +0000 UTC m=+2566.019825653" lastFinishedPulling="2025-10-06 07:28:53.397853951 +0000 UTC m=+2617.912594969" observedRunningTime="2025-10-06 07:28:55.061121034 +0000 UTC m=+2619.575862042" watchObservedRunningTime="2025-10-06 07:28:55.069074433 +0000 UTC m=+2619.583815441" Oct 06 07:29:23 crc kubenswrapper[4845]: I1006 07:29:23.019024 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 07:29:23 crc kubenswrapper[4845]: I1006 07:29:23.019675 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 07:29:53 crc kubenswrapper[4845]: I1006 07:29:53.019326 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 07:29:53 crc kubenswrapper[4845]: I1006 07:29:53.019837 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 07:30:00 crc kubenswrapper[4845]: I1006 07:30:00.156580 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328930-qmx62"] Oct 06 07:30:00 crc kubenswrapper[4845]: I1006 07:30:00.158274 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328930-qmx62" Oct 06 07:30:00 crc kubenswrapper[4845]: I1006 07:30:00.160297 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 07:30:00 crc kubenswrapper[4845]: I1006 07:30:00.161092 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 07:30:00 crc kubenswrapper[4845]: I1006 07:30:00.168780 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328930-qmx62"] Oct 06 07:30:00 crc kubenswrapper[4845]: I1006 07:30:00.355931 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2d3d4e26-7e15-431d-954f-f4ea0a59f5d0-secret-volume\") pod \"collect-profiles-29328930-qmx62\" (UID: \"2d3d4e26-7e15-431d-954f-f4ea0a59f5d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328930-qmx62" Oct 06 07:30:00 crc kubenswrapper[4845]: I1006 07:30:00.356505 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22bjh\" (UniqueName: \"kubernetes.io/projected/2d3d4e26-7e15-431d-954f-f4ea0a59f5d0-kube-api-access-22bjh\") pod \"collect-profiles-29328930-qmx62\" (UID: \"2d3d4e26-7e15-431d-954f-f4ea0a59f5d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328930-qmx62" Oct 06 07:30:00 crc kubenswrapper[4845]: I1006 07:30:00.356624 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d3d4e26-7e15-431d-954f-f4ea0a59f5d0-config-volume\") pod \"collect-profiles-29328930-qmx62\" (UID: \"2d3d4e26-7e15-431d-954f-f4ea0a59f5d0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29328930-qmx62" Oct 06 07:30:00 crc kubenswrapper[4845]: I1006 07:30:00.458681 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2d3d4e26-7e15-431d-954f-f4ea0a59f5d0-secret-volume\") pod \"collect-profiles-29328930-qmx62\" (UID: \"2d3d4e26-7e15-431d-954f-f4ea0a59f5d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328930-qmx62" Oct 06 07:30:00 crc kubenswrapper[4845]: I1006 07:30:00.458733 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22bjh\" (UniqueName: \"kubernetes.io/projected/2d3d4e26-7e15-431d-954f-f4ea0a59f5d0-kube-api-access-22bjh\") pod \"collect-profiles-29328930-qmx62\" (UID: \"2d3d4e26-7e15-431d-954f-f4ea0a59f5d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328930-qmx62" Oct 06 07:30:00 crc kubenswrapper[4845]: I1006 07:30:00.458801 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d3d4e26-7e15-431d-954f-f4ea0a59f5d0-config-volume\") pod \"collect-profiles-29328930-qmx62\" (UID: \"2d3d4e26-7e15-431d-954f-f4ea0a59f5d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328930-qmx62" Oct 06 07:30:00 crc kubenswrapper[4845]: I1006 07:30:00.459894 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d3d4e26-7e15-431d-954f-f4ea0a59f5d0-config-volume\") pod \"collect-profiles-29328930-qmx62\" (UID: \"2d3d4e26-7e15-431d-954f-f4ea0a59f5d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328930-qmx62" Oct 06 07:30:00 crc kubenswrapper[4845]: I1006 07:30:00.473894 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2d3d4e26-7e15-431d-954f-f4ea0a59f5d0-secret-volume\") pod \"collect-profiles-29328930-qmx62\" (UID: \"2d3d4e26-7e15-431d-954f-f4ea0a59f5d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328930-qmx62" Oct 06 07:30:00 crc kubenswrapper[4845]: I1006 07:30:00.479868 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22bjh\" (UniqueName: \"kubernetes.io/projected/2d3d4e26-7e15-431d-954f-f4ea0a59f5d0-kube-api-access-22bjh\") pod \"collect-profiles-29328930-qmx62\" (UID: \"2d3d4e26-7e15-431d-954f-f4ea0a59f5d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328930-qmx62" Oct 06 07:30:00 crc kubenswrapper[4845]: I1006 07:30:00.522749 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328930-qmx62" Oct 06 07:30:00 crc kubenswrapper[4845]: I1006 07:30:00.947170 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328930-qmx62"] Oct 06 07:30:01 crc kubenswrapper[4845]: I1006 07:30:01.610202 4845 generic.go:334] "Generic (PLEG): container finished" podID="2d3d4e26-7e15-431d-954f-f4ea0a59f5d0" containerID="3465b0cf994ad77150113a26f81046965dd3eba81a8e745004b1bca1ed9a7251" exitCode=0 Oct 06 07:30:01 crc kubenswrapper[4845]: I1006 07:30:01.610302 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328930-qmx62" event={"ID":"2d3d4e26-7e15-431d-954f-f4ea0a59f5d0","Type":"ContainerDied","Data":"3465b0cf994ad77150113a26f81046965dd3eba81a8e745004b1bca1ed9a7251"} Oct 06 07:30:01 crc kubenswrapper[4845]: I1006 07:30:01.610566 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328930-qmx62" 
event={"ID":"2d3d4e26-7e15-431d-954f-f4ea0a59f5d0","Type":"ContainerStarted","Data":"1a5d3c2e8d8f071731dd342cfa545bc59fbc121c11f878da2699f3534c5076bd"} Oct 06 07:30:03 crc kubenswrapper[4845]: I1006 07:30:03.060395 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328930-qmx62" Oct 06 07:30:03 crc kubenswrapper[4845]: I1006 07:30:03.217395 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2d3d4e26-7e15-431d-954f-f4ea0a59f5d0-secret-volume\") pod \"2d3d4e26-7e15-431d-954f-f4ea0a59f5d0\" (UID: \"2d3d4e26-7e15-431d-954f-f4ea0a59f5d0\") " Oct 06 07:30:03 crc kubenswrapper[4845]: I1006 07:30:03.217455 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22bjh\" (UniqueName: \"kubernetes.io/projected/2d3d4e26-7e15-431d-954f-f4ea0a59f5d0-kube-api-access-22bjh\") pod \"2d3d4e26-7e15-431d-954f-f4ea0a59f5d0\" (UID: \"2d3d4e26-7e15-431d-954f-f4ea0a59f5d0\") " Oct 06 07:30:03 crc kubenswrapper[4845]: I1006 07:30:03.217578 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d3d4e26-7e15-431d-954f-f4ea0a59f5d0-config-volume\") pod \"2d3d4e26-7e15-431d-954f-f4ea0a59f5d0\" (UID: \"2d3d4e26-7e15-431d-954f-f4ea0a59f5d0\") " Oct 06 07:30:03 crc kubenswrapper[4845]: I1006 07:30:03.218547 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d3d4e26-7e15-431d-954f-f4ea0a59f5d0-config-volume" (OuterVolumeSpecName: "config-volume") pod "2d3d4e26-7e15-431d-954f-f4ea0a59f5d0" (UID: "2d3d4e26-7e15-431d-954f-f4ea0a59f5d0"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:30:03 crc kubenswrapper[4845]: I1006 07:30:03.222855 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d3d4e26-7e15-431d-954f-f4ea0a59f5d0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2d3d4e26-7e15-431d-954f-f4ea0a59f5d0" (UID: "2d3d4e26-7e15-431d-954f-f4ea0a59f5d0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:30:03 crc kubenswrapper[4845]: I1006 07:30:03.230945 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d3d4e26-7e15-431d-954f-f4ea0a59f5d0-kube-api-access-22bjh" (OuterVolumeSpecName: "kube-api-access-22bjh") pod "2d3d4e26-7e15-431d-954f-f4ea0a59f5d0" (UID: "2d3d4e26-7e15-431d-954f-f4ea0a59f5d0"). InnerVolumeSpecName "kube-api-access-22bjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:30:03 crc kubenswrapper[4845]: I1006 07:30:03.319741 4845 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d3d4e26-7e15-431d-954f-f4ea0a59f5d0-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 07:30:03 crc kubenswrapper[4845]: I1006 07:30:03.319789 4845 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2d3d4e26-7e15-431d-954f-f4ea0a59f5d0-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 07:30:03 crc kubenswrapper[4845]: I1006 07:30:03.319803 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22bjh\" (UniqueName: \"kubernetes.io/projected/2d3d4e26-7e15-431d-954f-f4ea0a59f5d0-kube-api-access-22bjh\") on node \"crc\" DevicePath \"\"" Oct 06 07:30:03 crc kubenswrapper[4845]: I1006 07:30:03.628814 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328930-qmx62" 
event={"ID":"2d3d4e26-7e15-431d-954f-f4ea0a59f5d0","Type":"ContainerDied","Data":"1a5d3c2e8d8f071731dd342cfa545bc59fbc121c11f878da2699f3534c5076bd"} Oct 06 07:30:03 crc kubenswrapper[4845]: I1006 07:30:03.628846 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328930-qmx62" Oct 06 07:30:03 crc kubenswrapper[4845]: I1006 07:30:03.628854 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a5d3c2e8d8f071731dd342cfa545bc59fbc121c11f878da2699f3534c5076bd" Oct 06 07:30:04 crc kubenswrapper[4845]: I1006 07:30:04.130302 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd"] Oct 06 07:30:04 crc kubenswrapper[4845]: I1006 07:30:04.140039 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328885-29fwd"] Oct 06 07:30:04 crc kubenswrapper[4845]: I1006 07:30:04.239228 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d009d0a-a266-4988-b410-9f0b99b66f2f" path="/var/lib/kubelet/pods/5d009d0a-a266-4988-b410-9f0b99b66f2f/volumes" Oct 06 07:30:23 crc kubenswrapper[4845]: I1006 07:30:23.019237 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 07:30:23 crc kubenswrapper[4845]: I1006 07:30:23.019799 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 07:30:23 crc 
kubenswrapper[4845]: I1006 07:30:23.019866 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" Oct 06 07:30:23 crc kubenswrapper[4845]: I1006 07:30:23.020489 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b4fc779f56264c9c35988a3aa09616764c003f7e3ff2f7b4f4f6180c8cf37a25"} pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 07:30:23 crc kubenswrapper[4845]: I1006 07:30:23.020554 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" containerID="cri-o://b4fc779f56264c9c35988a3aa09616764c003f7e3ff2f7b4f4f6180c8cf37a25" gracePeriod=600 Oct 06 07:30:23 crc kubenswrapper[4845]: I1006 07:30:23.820321 4845 generic.go:334] "Generic (PLEG): container finished" podID="6936952c-09f0-48fd-8832-38c18202ae81" containerID="b4fc779f56264c9c35988a3aa09616764c003f7e3ff2f7b4f4f6180c8cf37a25" exitCode=0 Oct 06 07:30:23 crc kubenswrapper[4845]: I1006 07:30:23.820474 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerDied","Data":"b4fc779f56264c9c35988a3aa09616764c003f7e3ff2f7b4f4f6180c8cf37a25"} Oct 06 07:30:23 crc kubenswrapper[4845]: I1006 07:30:23.820909 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerStarted","Data":"4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9"} Oct 06 07:30:23 crc kubenswrapper[4845]: I1006 
07:30:23.820935 4845 scope.go:117] "RemoveContainer" containerID="793605b8de7a0c9dbdd8a7f5052aeb307e960d873637a70048b8fd6cecbe9a5b" Oct 06 07:30:24 crc kubenswrapper[4845]: I1006 07:30:24.770357 4845 scope.go:117] "RemoveContainer" containerID="0c56dfb1bd6701038d808614418b1beead073ac4ec1e8a5f9d5f5c123af04dcc" Oct 06 07:31:37 crc kubenswrapper[4845]: I1006 07:31:37.266065 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pz4fr"] Oct 06 07:31:37 crc kubenswrapper[4845]: E1006 07:31:37.267195 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d3d4e26-7e15-431d-954f-f4ea0a59f5d0" containerName="collect-profiles" Oct 06 07:31:37 crc kubenswrapper[4845]: I1006 07:31:37.267214 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d3d4e26-7e15-431d-954f-f4ea0a59f5d0" containerName="collect-profiles" Oct 06 07:31:37 crc kubenswrapper[4845]: I1006 07:31:37.267563 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d3d4e26-7e15-431d-954f-f4ea0a59f5d0" containerName="collect-profiles" Oct 06 07:31:37 crc kubenswrapper[4845]: I1006 07:31:37.269412 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pz4fr" Oct 06 07:31:37 crc kubenswrapper[4845]: I1006 07:31:37.279677 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pz4fr"] Oct 06 07:31:37 crc kubenswrapper[4845]: I1006 07:31:37.447487 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f51c480-861a-4d08-86e9-d17a340815d0-utilities\") pod \"community-operators-pz4fr\" (UID: \"6f51c480-861a-4d08-86e9-d17a340815d0\") " pod="openshift-marketplace/community-operators-pz4fr" Oct 06 07:31:37 crc kubenswrapper[4845]: I1006 07:31:37.447576 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f51c480-861a-4d08-86e9-d17a340815d0-catalog-content\") pod \"community-operators-pz4fr\" (UID: \"6f51c480-861a-4d08-86e9-d17a340815d0\") " pod="openshift-marketplace/community-operators-pz4fr" Oct 06 07:31:37 crc kubenswrapper[4845]: I1006 07:31:37.447659 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6hkz\" (UniqueName: \"kubernetes.io/projected/6f51c480-861a-4d08-86e9-d17a340815d0-kube-api-access-v6hkz\") pod \"community-operators-pz4fr\" (UID: \"6f51c480-861a-4d08-86e9-d17a340815d0\") " pod="openshift-marketplace/community-operators-pz4fr" Oct 06 07:31:37 crc kubenswrapper[4845]: I1006 07:31:37.548789 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6hkz\" (UniqueName: \"kubernetes.io/projected/6f51c480-861a-4d08-86e9-d17a340815d0-kube-api-access-v6hkz\") pod \"community-operators-pz4fr\" (UID: \"6f51c480-861a-4d08-86e9-d17a340815d0\") " pod="openshift-marketplace/community-operators-pz4fr" Oct 06 07:31:37 crc kubenswrapper[4845]: I1006 07:31:37.548875 4845 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f51c480-861a-4d08-86e9-d17a340815d0-utilities\") pod \"community-operators-pz4fr\" (UID: \"6f51c480-861a-4d08-86e9-d17a340815d0\") " pod="openshift-marketplace/community-operators-pz4fr" Oct 06 07:31:37 crc kubenswrapper[4845]: I1006 07:31:37.548945 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f51c480-861a-4d08-86e9-d17a340815d0-catalog-content\") pod \"community-operators-pz4fr\" (UID: \"6f51c480-861a-4d08-86e9-d17a340815d0\") " pod="openshift-marketplace/community-operators-pz4fr" Oct 06 07:31:37 crc kubenswrapper[4845]: I1006 07:31:37.549523 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f51c480-861a-4d08-86e9-d17a340815d0-catalog-content\") pod \"community-operators-pz4fr\" (UID: \"6f51c480-861a-4d08-86e9-d17a340815d0\") " pod="openshift-marketplace/community-operators-pz4fr" Oct 06 07:31:37 crc kubenswrapper[4845]: I1006 07:31:37.550035 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f51c480-861a-4d08-86e9-d17a340815d0-utilities\") pod \"community-operators-pz4fr\" (UID: \"6f51c480-861a-4d08-86e9-d17a340815d0\") " pod="openshift-marketplace/community-operators-pz4fr" Oct 06 07:31:37 crc kubenswrapper[4845]: I1006 07:31:37.578205 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6hkz\" (UniqueName: \"kubernetes.io/projected/6f51c480-861a-4d08-86e9-d17a340815d0-kube-api-access-v6hkz\") pod \"community-operators-pz4fr\" (UID: \"6f51c480-861a-4d08-86e9-d17a340815d0\") " pod="openshift-marketplace/community-operators-pz4fr" Oct 06 07:31:37 crc kubenswrapper[4845]: I1006 07:31:37.587520 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pz4fr" Oct 06 07:31:38 crc kubenswrapper[4845]: I1006 07:31:38.125459 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pz4fr"] Oct 06 07:31:38 crc kubenswrapper[4845]: I1006 07:31:38.460043 4845 generic.go:334] "Generic (PLEG): container finished" podID="6f51c480-861a-4d08-86e9-d17a340815d0" containerID="47a8dd772d5234e435e70786d01ad699f1fc126182860ac0be2319798b86d55a" exitCode=0 Oct 06 07:31:38 crc kubenswrapper[4845]: I1006 07:31:38.460099 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pz4fr" event={"ID":"6f51c480-861a-4d08-86e9-d17a340815d0","Type":"ContainerDied","Data":"47a8dd772d5234e435e70786d01ad699f1fc126182860ac0be2319798b86d55a"} Oct 06 07:31:38 crc kubenswrapper[4845]: I1006 07:31:38.460126 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pz4fr" event={"ID":"6f51c480-861a-4d08-86e9-d17a340815d0","Type":"ContainerStarted","Data":"3a4d90861559364134014d072023bc95c75e31d364273ccac7076c5de4794203"} Oct 06 07:31:44 crc kubenswrapper[4845]: I1006 07:31:44.538420 4845 generic.go:334] "Generic (PLEG): container finished" podID="6f51c480-861a-4d08-86e9-d17a340815d0" containerID="eb0771592ccce355faa506091d5ccf0f9d0564df907e38027c9ffc664c06228b" exitCode=0 Oct 06 07:31:44 crc kubenswrapper[4845]: I1006 07:31:44.538520 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pz4fr" event={"ID":"6f51c480-861a-4d08-86e9-d17a340815d0","Type":"ContainerDied","Data":"eb0771592ccce355faa506091d5ccf0f9d0564df907e38027c9ffc664c06228b"} Oct 06 07:31:45 crc kubenswrapper[4845]: I1006 07:31:45.555342 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pz4fr" 
event={"ID":"6f51c480-861a-4d08-86e9-d17a340815d0","Type":"ContainerStarted","Data":"4ee282a388a25da56522a73385055072bdb99891e7c6148b7dd3dfcb5045c55c"} Oct 06 07:31:45 crc kubenswrapper[4845]: I1006 07:31:45.579330 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pz4fr" podStartSLOduration=2.113626539 podStartE2EDuration="8.579309532s" podCreationTimestamp="2025-10-06 07:31:37 +0000 UTC" firstStartedPulling="2025-10-06 07:31:38.462159042 +0000 UTC m=+2782.976900050" lastFinishedPulling="2025-10-06 07:31:44.927842045 +0000 UTC m=+2789.442583043" observedRunningTime="2025-10-06 07:31:45.573833245 +0000 UTC m=+2790.088574263" watchObservedRunningTime="2025-10-06 07:31:45.579309532 +0000 UTC m=+2790.094050540" Oct 06 07:31:47 crc kubenswrapper[4845]: I1006 07:31:47.588737 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pz4fr" Oct 06 07:31:47 crc kubenswrapper[4845]: I1006 07:31:47.589810 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pz4fr" Oct 06 07:31:47 crc kubenswrapper[4845]: I1006 07:31:47.631364 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pz4fr" Oct 06 07:31:57 crc kubenswrapper[4845]: I1006 07:31:57.652835 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pz4fr" Oct 06 07:31:57 crc kubenswrapper[4845]: I1006 07:31:57.733484 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pz4fr"] Oct 06 07:31:57 crc kubenswrapper[4845]: I1006 07:31:57.779901 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9qgph"] Oct 06 07:31:57 crc kubenswrapper[4845]: I1006 07:31:57.784473 4845 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/community-operators-9qgph" podUID="ec5e118e-758c-4120-a070-fe923261cadc" containerName="registry-server" containerID="cri-o://6a8edcc8aa5f8a56678964d9d9a79e869ee7e4f64d55084a48d6e827591c1111" gracePeriod=2 Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.304852 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9qgph" Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.463978 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pttfh\" (UniqueName: \"kubernetes.io/projected/ec5e118e-758c-4120-a070-fe923261cadc-kube-api-access-pttfh\") pod \"ec5e118e-758c-4120-a070-fe923261cadc\" (UID: \"ec5e118e-758c-4120-a070-fe923261cadc\") " Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.464046 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec5e118e-758c-4120-a070-fe923261cadc-catalog-content\") pod \"ec5e118e-758c-4120-a070-fe923261cadc\" (UID: \"ec5e118e-758c-4120-a070-fe923261cadc\") " Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.464174 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec5e118e-758c-4120-a070-fe923261cadc-utilities\") pod \"ec5e118e-758c-4120-a070-fe923261cadc\" (UID: \"ec5e118e-758c-4120-a070-fe923261cadc\") " Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.464706 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec5e118e-758c-4120-a070-fe923261cadc-utilities" (OuterVolumeSpecName: "utilities") pod "ec5e118e-758c-4120-a070-fe923261cadc" (UID: "ec5e118e-758c-4120-a070-fe923261cadc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.466137 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec5e118e-758c-4120-a070-fe923261cadc-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.477105 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec5e118e-758c-4120-a070-fe923261cadc-kube-api-access-pttfh" (OuterVolumeSpecName: "kube-api-access-pttfh") pod "ec5e118e-758c-4120-a070-fe923261cadc" (UID: "ec5e118e-758c-4120-a070-fe923261cadc"). InnerVolumeSpecName "kube-api-access-pttfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.547176 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec5e118e-758c-4120-a070-fe923261cadc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec5e118e-758c-4120-a070-fe923261cadc" (UID: "ec5e118e-758c-4120-a070-fe923261cadc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.568144 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pttfh\" (UniqueName: \"kubernetes.io/projected/ec5e118e-758c-4120-a070-fe923261cadc-kube-api-access-pttfh\") on node \"crc\" DevicePath \"\"" Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.568181 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec5e118e-758c-4120-a070-fe923261cadc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.675885 4845 generic.go:334] "Generic (PLEG): container finished" podID="ec5e118e-758c-4120-a070-fe923261cadc" containerID="6a8edcc8aa5f8a56678964d9d9a79e869ee7e4f64d55084a48d6e827591c1111" exitCode=0 Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.676499 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9qgph" Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.688435 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9qgph" event={"ID":"ec5e118e-758c-4120-a070-fe923261cadc","Type":"ContainerDied","Data":"6a8edcc8aa5f8a56678964d9d9a79e869ee7e4f64d55084a48d6e827591c1111"} Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.688479 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9qgph" event={"ID":"ec5e118e-758c-4120-a070-fe923261cadc","Type":"ContainerDied","Data":"795ae49680d4571095b0549902e2da7bbb4ae68ddbd0e91da7f2c54f4b98812f"} Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.688499 4845 scope.go:117] "RemoveContainer" containerID="6a8edcc8aa5f8a56678964d9d9a79e869ee7e4f64d55084a48d6e827591c1111" Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.719972 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-9qgph"] Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.721550 4845 scope.go:117] "RemoveContainer" containerID="8b646d39621b8712151d3d1c39cb2f9315c04c850eef467649eef2a8aa2bf1be" Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.732764 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9qgph"] Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.749835 4845 scope.go:117] "RemoveContainer" containerID="90f935dabbe4dcc535f3c4219f26b1be7ba31fdf7511a2ee3303647dd8912274" Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.801537 4845 scope.go:117] "RemoveContainer" containerID="6a8edcc8aa5f8a56678964d9d9a79e869ee7e4f64d55084a48d6e827591c1111" Oct 06 07:31:58 crc kubenswrapper[4845]: E1006 07:31:58.802861 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a8edcc8aa5f8a56678964d9d9a79e869ee7e4f64d55084a48d6e827591c1111\": container with ID starting with 6a8edcc8aa5f8a56678964d9d9a79e869ee7e4f64d55084a48d6e827591c1111 not found: ID does not exist" containerID="6a8edcc8aa5f8a56678964d9d9a79e869ee7e4f64d55084a48d6e827591c1111" Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.802895 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a8edcc8aa5f8a56678964d9d9a79e869ee7e4f64d55084a48d6e827591c1111"} err="failed to get container status \"6a8edcc8aa5f8a56678964d9d9a79e869ee7e4f64d55084a48d6e827591c1111\": rpc error: code = NotFound desc = could not find container \"6a8edcc8aa5f8a56678964d9d9a79e869ee7e4f64d55084a48d6e827591c1111\": container with ID starting with 6a8edcc8aa5f8a56678964d9d9a79e869ee7e4f64d55084a48d6e827591c1111 not found: ID does not exist" Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.802915 4845 scope.go:117] "RemoveContainer" 
containerID="8b646d39621b8712151d3d1c39cb2f9315c04c850eef467649eef2a8aa2bf1be" Oct 06 07:31:58 crc kubenswrapper[4845]: E1006 07:31:58.803191 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b646d39621b8712151d3d1c39cb2f9315c04c850eef467649eef2a8aa2bf1be\": container with ID starting with 8b646d39621b8712151d3d1c39cb2f9315c04c850eef467649eef2a8aa2bf1be not found: ID does not exist" containerID="8b646d39621b8712151d3d1c39cb2f9315c04c850eef467649eef2a8aa2bf1be" Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.803214 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b646d39621b8712151d3d1c39cb2f9315c04c850eef467649eef2a8aa2bf1be"} err="failed to get container status \"8b646d39621b8712151d3d1c39cb2f9315c04c850eef467649eef2a8aa2bf1be\": rpc error: code = NotFound desc = could not find container \"8b646d39621b8712151d3d1c39cb2f9315c04c850eef467649eef2a8aa2bf1be\": container with ID starting with 8b646d39621b8712151d3d1c39cb2f9315c04c850eef467649eef2a8aa2bf1be not found: ID does not exist" Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.803228 4845 scope.go:117] "RemoveContainer" containerID="90f935dabbe4dcc535f3c4219f26b1be7ba31fdf7511a2ee3303647dd8912274" Oct 06 07:31:58 crc kubenswrapper[4845]: E1006 07:31:58.803458 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90f935dabbe4dcc535f3c4219f26b1be7ba31fdf7511a2ee3303647dd8912274\": container with ID starting with 90f935dabbe4dcc535f3c4219f26b1be7ba31fdf7511a2ee3303647dd8912274 not found: ID does not exist" containerID="90f935dabbe4dcc535f3c4219f26b1be7ba31fdf7511a2ee3303647dd8912274" Oct 06 07:31:58 crc kubenswrapper[4845]: I1006 07:31:58.803479 4845 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"90f935dabbe4dcc535f3c4219f26b1be7ba31fdf7511a2ee3303647dd8912274"} err="failed to get container status \"90f935dabbe4dcc535f3c4219f26b1be7ba31fdf7511a2ee3303647dd8912274\": rpc error: code = NotFound desc = could not find container \"90f935dabbe4dcc535f3c4219f26b1be7ba31fdf7511a2ee3303647dd8912274\": container with ID starting with 90f935dabbe4dcc535f3c4219f26b1be7ba31fdf7511a2ee3303647dd8912274 not found: ID does not exist" Oct 06 07:32:00 crc kubenswrapper[4845]: I1006 07:32:00.236593 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec5e118e-758c-4120-a070-fe923261cadc" path="/var/lib/kubelet/pods/ec5e118e-758c-4120-a070-fe923261cadc/volumes" Oct 06 07:32:23 crc kubenswrapper[4845]: I1006 07:32:23.019552 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 07:32:23 crc kubenswrapper[4845]: I1006 07:32:23.020169 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 07:32:53 crc kubenswrapper[4845]: I1006 07:32:53.019321 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 07:32:53 crc kubenswrapper[4845]: I1006 07:32:53.020521 4845 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 07:33:23 crc kubenswrapper[4845]: I1006 07:33:23.020112 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 07:33:23 crc kubenswrapper[4845]: I1006 07:33:23.020778 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 07:33:23 crc kubenswrapper[4845]: I1006 07:33:23.020842 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" Oct 06 07:33:23 crc kubenswrapper[4845]: I1006 07:33:23.021650 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9"} pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 07:33:23 crc kubenswrapper[4845]: I1006 07:33:23.021897 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" 
containerID="cri-o://4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" gracePeriod=600 Oct 06 07:33:23 crc kubenswrapper[4845]: E1006 07:33:23.145336 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:33:23 crc kubenswrapper[4845]: I1006 07:33:23.377853 4845 generic.go:334] "Generic (PLEG): container finished" podID="6936952c-09f0-48fd-8832-38c18202ae81" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" exitCode=0 Oct 06 07:33:23 crc kubenswrapper[4845]: I1006 07:33:23.377901 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerDied","Data":"4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9"} Oct 06 07:33:23 crc kubenswrapper[4845]: I1006 07:33:23.377945 4845 scope.go:117] "RemoveContainer" containerID="b4fc779f56264c9c35988a3aa09616764c003f7e3ff2f7b4f4f6180c8cf37a25" Oct 06 07:33:23 crc kubenswrapper[4845]: I1006 07:33:23.378876 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:33:23 crc kubenswrapper[4845]: E1006 07:33:23.379135 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" 
podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:33:38 crc kubenswrapper[4845]: I1006 07:33:38.226982 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:33:38 crc kubenswrapper[4845]: E1006 07:33:38.227853 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:33:51 crc kubenswrapper[4845]: I1006 07:33:51.226740 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:33:51 crc kubenswrapper[4845]: E1006 07:33:51.228806 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:34:06 crc kubenswrapper[4845]: I1006 07:34:06.234491 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:34:06 crc kubenswrapper[4845]: E1006 07:34:06.235349 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:34:13 crc kubenswrapper[4845]: I1006 07:34:13.384943 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jv2fm"] Oct 06 07:34:13 crc kubenswrapper[4845]: E1006 07:34:13.386979 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec5e118e-758c-4120-a070-fe923261cadc" containerName="extract-content" Oct 06 07:34:13 crc kubenswrapper[4845]: I1006 07:34:13.387062 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec5e118e-758c-4120-a070-fe923261cadc" containerName="extract-content" Oct 06 07:34:13 crc kubenswrapper[4845]: E1006 07:34:13.387127 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec5e118e-758c-4120-a070-fe923261cadc" containerName="extract-utilities" Oct 06 07:34:13 crc kubenswrapper[4845]: I1006 07:34:13.387183 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec5e118e-758c-4120-a070-fe923261cadc" containerName="extract-utilities" Oct 06 07:34:13 crc kubenswrapper[4845]: E1006 07:34:13.387247 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec5e118e-758c-4120-a070-fe923261cadc" containerName="registry-server" Oct 06 07:34:13 crc kubenswrapper[4845]: I1006 07:34:13.387300 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec5e118e-758c-4120-a070-fe923261cadc" containerName="registry-server" Oct 06 07:34:13 crc kubenswrapper[4845]: I1006 07:34:13.387552 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec5e118e-758c-4120-a070-fe923261cadc" containerName="registry-server" Oct 06 07:34:13 crc kubenswrapper[4845]: I1006 07:34:13.389209 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jv2fm" Oct 06 07:34:13 crc kubenswrapper[4845]: I1006 07:34:13.398728 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jv2fm"] Oct 06 07:34:13 crc kubenswrapper[4845]: I1006 07:34:13.526717 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7891ccb3-d647-41f3-a207-5df1f74bb64e-utilities\") pod \"certified-operators-jv2fm\" (UID: \"7891ccb3-d647-41f3-a207-5df1f74bb64e\") " pod="openshift-marketplace/certified-operators-jv2fm" Oct 06 07:34:13 crc kubenswrapper[4845]: I1006 07:34:13.527073 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7891ccb3-d647-41f3-a207-5df1f74bb64e-catalog-content\") pod \"certified-operators-jv2fm\" (UID: \"7891ccb3-d647-41f3-a207-5df1f74bb64e\") " pod="openshift-marketplace/certified-operators-jv2fm" Oct 06 07:34:13 crc kubenswrapper[4845]: I1006 07:34:13.527108 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbmzj\" (UniqueName: \"kubernetes.io/projected/7891ccb3-d647-41f3-a207-5df1f74bb64e-kube-api-access-hbmzj\") pod \"certified-operators-jv2fm\" (UID: \"7891ccb3-d647-41f3-a207-5df1f74bb64e\") " pod="openshift-marketplace/certified-operators-jv2fm" Oct 06 07:34:13 crc kubenswrapper[4845]: I1006 07:34:13.628544 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7891ccb3-d647-41f3-a207-5df1f74bb64e-utilities\") pod \"certified-operators-jv2fm\" (UID: \"7891ccb3-d647-41f3-a207-5df1f74bb64e\") " pod="openshift-marketplace/certified-operators-jv2fm" Oct 06 07:34:13 crc kubenswrapper[4845]: I1006 07:34:13.628847 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7891ccb3-d647-41f3-a207-5df1f74bb64e-catalog-content\") pod \"certified-operators-jv2fm\" (UID: \"7891ccb3-d647-41f3-a207-5df1f74bb64e\") " pod="openshift-marketplace/certified-operators-jv2fm" Oct 06 07:34:13 crc kubenswrapper[4845]: I1006 07:34:13.628956 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbmzj\" (UniqueName: \"kubernetes.io/projected/7891ccb3-d647-41f3-a207-5df1f74bb64e-kube-api-access-hbmzj\") pod \"certified-operators-jv2fm\" (UID: \"7891ccb3-d647-41f3-a207-5df1f74bb64e\") " pod="openshift-marketplace/certified-operators-jv2fm" Oct 06 07:34:13 crc kubenswrapper[4845]: I1006 07:34:13.629177 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7891ccb3-d647-41f3-a207-5df1f74bb64e-utilities\") pod \"certified-operators-jv2fm\" (UID: \"7891ccb3-d647-41f3-a207-5df1f74bb64e\") " pod="openshift-marketplace/certified-operators-jv2fm" Oct 06 07:34:13 crc kubenswrapper[4845]: I1006 07:34:13.629265 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7891ccb3-d647-41f3-a207-5df1f74bb64e-catalog-content\") pod \"certified-operators-jv2fm\" (UID: \"7891ccb3-d647-41f3-a207-5df1f74bb64e\") " pod="openshift-marketplace/certified-operators-jv2fm" Oct 06 07:34:13 crc kubenswrapper[4845]: I1006 07:34:13.654600 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbmzj\" (UniqueName: \"kubernetes.io/projected/7891ccb3-d647-41f3-a207-5df1f74bb64e-kube-api-access-hbmzj\") pod \"certified-operators-jv2fm\" (UID: \"7891ccb3-d647-41f3-a207-5df1f74bb64e\") " pod="openshift-marketplace/certified-operators-jv2fm" Oct 06 07:34:13 crc kubenswrapper[4845]: I1006 07:34:13.711688 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jv2fm" Oct 06 07:34:14 crc kubenswrapper[4845]: I1006 07:34:14.220829 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jv2fm"] Oct 06 07:34:14 crc kubenswrapper[4845]: I1006 07:34:14.808461 4845 generic.go:334] "Generic (PLEG): container finished" podID="7891ccb3-d647-41f3-a207-5df1f74bb64e" containerID="bcc6f14ab90b771da2d5d3c89ca1e0316a73790305af9562b5567674dc12ae70" exitCode=0 Oct 06 07:34:14 crc kubenswrapper[4845]: I1006 07:34:14.808521 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jv2fm" event={"ID":"7891ccb3-d647-41f3-a207-5df1f74bb64e","Type":"ContainerDied","Data":"bcc6f14ab90b771da2d5d3c89ca1e0316a73790305af9562b5567674dc12ae70"} Oct 06 07:34:14 crc kubenswrapper[4845]: I1006 07:34:14.808766 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jv2fm" event={"ID":"7891ccb3-d647-41f3-a207-5df1f74bb64e","Type":"ContainerStarted","Data":"1f2841bb46d8ea6db41435d58fcff4ce436971382a1a953f970479dcd0304631"} Oct 06 07:34:14 crc kubenswrapper[4845]: I1006 07:34:14.810258 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 07:34:15 crc kubenswrapper[4845]: I1006 07:34:15.820774 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jv2fm" event={"ID":"7891ccb3-d647-41f3-a207-5df1f74bb64e","Type":"ContainerStarted","Data":"de5e5046e49a829e6226ba29f688375e436ee09bfa7260ce108b99ed807fb2bf"} Oct 06 07:34:16 crc kubenswrapper[4845]: I1006 07:34:16.830933 4845 generic.go:334] "Generic (PLEG): container finished" podID="7891ccb3-d647-41f3-a207-5df1f74bb64e" containerID="de5e5046e49a829e6226ba29f688375e436ee09bfa7260ce108b99ed807fb2bf" exitCode=0 Oct 06 07:34:16 crc kubenswrapper[4845]: I1006 07:34:16.830993 4845 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-jv2fm" event={"ID":"7891ccb3-d647-41f3-a207-5df1f74bb64e","Type":"ContainerDied","Data":"de5e5046e49a829e6226ba29f688375e436ee09bfa7260ce108b99ed807fb2bf"} Oct 06 07:34:17 crc kubenswrapper[4845]: I1006 07:34:17.228076 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:34:17 crc kubenswrapper[4845]: E1006 07:34:17.228429 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:34:17 crc kubenswrapper[4845]: I1006 07:34:17.843186 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jv2fm" event={"ID":"7891ccb3-d647-41f3-a207-5df1f74bb64e","Type":"ContainerStarted","Data":"435c6fc2529298bebb3bf632ad14a73c5c91d4101cd5fe9281d1a52aab876091"} Oct 06 07:34:17 crc kubenswrapper[4845]: I1006 07:34:17.866409 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jv2fm" podStartSLOduration=2.327453738 podStartE2EDuration="4.866390264s" podCreationTimestamp="2025-10-06 07:34:13 +0000 UTC" firstStartedPulling="2025-10-06 07:34:14.810029763 +0000 UTC m=+2939.324770771" lastFinishedPulling="2025-10-06 07:34:17.348966289 +0000 UTC m=+2941.863707297" observedRunningTime="2025-10-06 07:34:17.860849273 +0000 UTC m=+2942.375590291" watchObservedRunningTime="2025-10-06 07:34:17.866390264 +0000 UTC m=+2942.381131272" Oct 06 07:34:23 crc kubenswrapper[4845]: I1006 07:34:23.712762 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-jv2fm" Oct 06 07:34:23 crc kubenswrapper[4845]: I1006 07:34:23.713417 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jv2fm" Oct 06 07:34:23 crc kubenswrapper[4845]: I1006 07:34:23.759957 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jv2fm" Oct 06 07:34:23 crc kubenswrapper[4845]: I1006 07:34:23.936225 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jv2fm" Oct 06 07:34:24 crc kubenswrapper[4845]: I1006 07:34:24.923285 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jv2fm"] Oct 06 07:34:25 crc kubenswrapper[4845]: I1006 07:34:25.912001 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jv2fm" podUID="7891ccb3-d647-41f3-a207-5df1f74bb64e" containerName="registry-server" containerID="cri-o://435c6fc2529298bebb3bf632ad14a73c5c91d4101cd5fe9281d1a52aab876091" gracePeriod=2 Oct 06 07:34:26 crc kubenswrapper[4845]: I1006 07:34:26.391871 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jv2fm" Oct 06 07:34:26 crc kubenswrapper[4845]: I1006 07:34:26.485786 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7891ccb3-d647-41f3-a207-5df1f74bb64e-utilities\") pod \"7891ccb3-d647-41f3-a207-5df1f74bb64e\" (UID: \"7891ccb3-d647-41f3-a207-5df1f74bb64e\") " Oct 06 07:34:26 crc kubenswrapper[4845]: I1006 07:34:26.485915 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbmzj\" (UniqueName: \"kubernetes.io/projected/7891ccb3-d647-41f3-a207-5df1f74bb64e-kube-api-access-hbmzj\") pod \"7891ccb3-d647-41f3-a207-5df1f74bb64e\" (UID: \"7891ccb3-d647-41f3-a207-5df1f74bb64e\") " Oct 06 07:34:26 crc kubenswrapper[4845]: I1006 07:34:26.486052 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7891ccb3-d647-41f3-a207-5df1f74bb64e-catalog-content\") pod \"7891ccb3-d647-41f3-a207-5df1f74bb64e\" (UID: \"7891ccb3-d647-41f3-a207-5df1f74bb64e\") " Oct 06 07:34:26 crc kubenswrapper[4845]: I1006 07:34:26.486660 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7891ccb3-d647-41f3-a207-5df1f74bb64e-utilities" (OuterVolumeSpecName: "utilities") pod "7891ccb3-d647-41f3-a207-5df1f74bb64e" (UID: "7891ccb3-d647-41f3-a207-5df1f74bb64e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:34:26 crc kubenswrapper[4845]: I1006 07:34:26.492330 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7891ccb3-d647-41f3-a207-5df1f74bb64e-kube-api-access-hbmzj" (OuterVolumeSpecName: "kube-api-access-hbmzj") pod "7891ccb3-d647-41f3-a207-5df1f74bb64e" (UID: "7891ccb3-d647-41f3-a207-5df1f74bb64e"). InnerVolumeSpecName "kube-api-access-hbmzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:34:26 crc kubenswrapper[4845]: I1006 07:34:26.539184 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7891ccb3-d647-41f3-a207-5df1f74bb64e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7891ccb3-d647-41f3-a207-5df1f74bb64e" (UID: "7891ccb3-d647-41f3-a207-5df1f74bb64e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:34:26 crc kubenswrapper[4845]: I1006 07:34:26.587844 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7891ccb3-d647-41f3-a207-5df1f74bb64e-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 07:34:26 crc kubenswrapper[4845]: I1006 07:34:26.587993 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbmzj\" (UniqueName: \"kubernetes.io/projected/7891ccb3-d647-41f3-a207-5df1f74bb64e-kube-api-access-hbmzj\") on node \"crc\" DevicePath \"\"" Oct 06 07:34:26 crc kubenswrapper[4845]: I1006 07:34:26.588118 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7891ccb3-d647-41f3-a207-5df1f74bb64e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 07:34:26 crc kubenswrapper[4845]: I1006 07:34:26.927820 4845 generic.go:334] "Generic (PLEG): container finished" podID="7891ccb3-d647-41f3-a207-5df1f74bb64e" containerID="435c6fc2529298bebb3bf632ad14a73c5c91d4101cd5fe9281d1a52aab876091" exitCode=0 Oct 06 07:34:26 crc kubenswrapper[4845]: I1006 07:34:26.927926 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jv2fm" event={"ID":"7891ccb3-d647-41f3-a207-5df1f74bb64e","Type":"ContainerDied","Data":"435c6fc2529298bebb3bf632ad14a73c5c91d4101cd5fe9281d1a52aab876091"} Oct 06 07:34:26 crc kubenswrapper[4845]: I1006 07:34:26.927967 4845 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jv2fm" Oct 06 07:34:26 crc kubenswrapper[4845]: I1006 07:34:26.928009 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jv2fm" event={"ID":"7891ccb3-d647-41f3-a207-5df1f74bb64e","Type":"ContainerDied","Data":"1f2841bb46d8ea6db41435d58fcff4ce436971382a1a953f970479dcd0304631"} Oct 06 07:34:26 crc kubenswrapper[4845]: I1006 07:34:26.928051 4845 scope.go:117] "RemoveContainer" containerID="435c6fc2529298bebb3bf632ad14a73c5c91d4101cd5fe9281d1a52aab876091" Oct 06 07:34:26 crc kubenswrapper[4845]: I1006 07:34:26.955420 4845 scope.go:117] "RemoveContainer" containerID="de5e5046e49a829e6226ba29f688375e436ee09bfa7260ce108b99ed807fb2bf" Oct 06 07:34:26 crc kubenswrapper[4845]: I1006 07:34:26.985316 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jv2fm"] Oct 06 07:34:26 crc kubenswrapper[4845]: I1006 07:34:26.992789 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jv2fm"] Oct 06 07:34:27 crc kubenswrapper[4845]: I1006 07:34:27.004541 4845 scope.go:117] "RemoveContainer" containerID="bcc6f14ab90b771da2d5d3c89ca1e0316a73790305af9562b5567674dc12ae70" Oct 06 07:34:27 crc kubenswrapper[4845]: I1006 07:34:27.056872 4845 scope.go:117] "RemoveContainer" containerID="435c6fc2529298bebb3bf632ad14a73c5c91d4101cd5fe9281d1a52aab876091" Oct 06 07:34:27 crc kubenswrapper[4845]: E1006 07:34:27.057480 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"435c6fc2529298bebb3bf632ad14a73c5c91d4101cd5fe9281d1a52aab876091\": container with ID starting with 435c6fc2529298bebb3bf632ad14a73c5c91d4101cd5fe9281d1a52aab876091 not found: ID does not exist" containerID="435c6fc2529298bebb3bf632ad14a73c5c91d4101cd5fe9281d1a52aab876091" Oct 06 07:34:27 crc kubenswrapper[4845]: I1006 07:34:27.057519 
4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"435c6fc2529298bebb3bf632ad14a73c5c91d4101cd5fe9281d1a52aab876091"} err="failed to get container status \"435c6fc2529298bebb3bf632ad14a73c5c91d4101cd5fe9281d1a52aab876091\": rpc error: code = NotFound desc = could not find container \"435c6fc2529298bebb3bf632ad14a73c5c91d4101cd5fe9281d1a52aab876091\": container with ID starting with 435c6fc2529298bebb3bf632ad14a73c5c91d4101cd5fe9281d1a52aab876091 not found: ID does not exist" Oct 06 07:34:27 crc kubenswrapper[4845]: I1006 07:34:27.057544 4845 scope.go:117] "RemoveContainer" containerID="de5e5046e49a829e6226ba29f688375e436ee09bfa7260ce108b99ed807fb2bf" Oct 06 07:34:27 crc kubenswrapper[4845]: E1006 07:34:27.057923 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de5e5046e49a829e6226ba29f688375e436ee09bfa7260ce108b99ed807fb2bf\": container with ID starting with de5e5046e49a829e6226ba29f688375e436ee09bfa7260ce108b99ed807fb2bf not found: ID does not exist" containerID="de5e5046e49a829e6226ba29f688375e436ee09bfa7260ce108b99ed807fb2bf" Oct 06 07:34:27 crc kubenswrapper[4845]: I1006 07:34:27.057952 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de5e5046e49a829e6226ba29f688375e436ee09bfa7260ce108b99ed807fb2bf"} err="failed to get container status \"de5e5046e49a829e6226ba29f688375e436ee09bfa7260ce108b99ed807fb2bf\": rpc error: code = NotFound desc = could not find container \"de5e5046e49a829e6226ba29f688375e436ee09bfa7260ce108b99ed807fb2bf\": container with ID starting with de5e5046e49a829e6226ba29f688375e436ee09bfa7260ce108b99ed807fb2bf not found: ID does not exist" Oct 06 07:34:27 crc kubenswrapper[4845]: I1006 07:34:27.057973 4845 scope.go:117] "RemoveContainer" containerID="bcc6f14ab90b771da2d5d3c89ca1e0316a73790305af9562b5567674dc12ae70" Oct 06 07:34:27 crc kubenswrapper[4845]: E1006 
07:34:27.058274 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcc6f14ab90b771da2d5d3c89ca1e0316a73790305af9562b5567674dc12ae70\": container with ID starting with bcc6f14ab90b771da2d5d3c89ca1e0316a73790305af9562b5567674dc12ae70 not found: ID does not exist" containerID="bcc6f14ab90b771da2d5d3c89ca1e0316a73790305af9562b5567674dc12ae70" Oct 06 07:34:27 crc kubenswrapper[4845]: I1006 07:34:27.058305 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcc6f14ab90b771da2d5d3c89ca1e0316a73790305af9562b5567674dc12ae70"} err="failed to get container status \"bcc6f14ab90b771da2d5d3c89ca1e0316a73790305af9562b5567674dc12ae70\": rpc error: code = NotFound desc = could not find container \"bcc6f14ab90b771da2d5d3c89ca1e0316a73790305af9562b5567674dc12ae70\": container with ID starting with bcc6f14ab90b771da2d5d3c89ca1e0316a73790305af9562b5567674dc12ae70 not found: ID does not exist" Oct 06 07:34:28 crc kubenswrapper[4845]: I1006 07:34:28.238317 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7891ccb3-d647-41f3-a207-5df1f74bb64e" path="/var/lib/kubelet/pods/7891ccb3-d647-41f3-a207-5df1f74bb64e/volumes" Oct 06 07:34:29 crc kubenswrapper[4845]: I1006 07:34:29.227922 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:34:29 crc kubenswrapper[4845]: E1006 07:34:29.229107 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:34:44 crc kubenswrapper[4845]: I1006 07:34:44.227441 
4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:34:44 crc kubenswrapper[4845]: E1006 07:34:44.228086 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:34:59 crc kubenswrapper[4845]: I1006 07:34:59.226838 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:34:59 crc kubenswrapper[4845]: E1006 07:34:59.227764 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:35:14 crc kubenswrapper[4845]: I1006 07:35:14.227145 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:35:14 crc kubenswrapper[4845]: E1006 07:35:14.229712 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:35:26 crc kubenswrapper[4845]: I1006 
07:35:26.236806 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:35:26 crc kubenswrapper[4845]: E1006 07:35:26.237754 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:35:32 crc kubenswrapper[4845]: I1006 07:35:32.374716 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r2dbw"] Oct 06 07:35:32 crc kubenswrapper[4845]: E1006 07:35:32.375953 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7891ccb3-d647-41f3-a207-5df1f74bb64e" containerName="registry-server" Oct 06 07:35:32 crc kubenswrapper[4845]: I1006 07:35:32.375974 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7891ccb3-d647-41f3-a207-5df1f74bb64e" containerName="registry-server" Oct 06 07:35:32 crc kubenswrapper[4845]: E1006 07:35:32.376007 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7891ccb3-d647-41f3-a207-5df1f74bb64e" containerName="extract-utilities" Oct 06 07:35:32 crc kubenswrapper[4845]: I1006 07:35:32.376021 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7891ccb3-d647-41f3-a207-5df1f74bb64e" containerName="extract-utilities" Oct 06 07:35:32 crc kubenswrapper[4845]: E1006 07:35:32.376062 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7891ccb3-d647-41f3-a207-5df1f74bb64e" containerName="extract-content" Oct 06 07:35:32 crc kubenswrapper[4845]: I1006 07:35:32.376075 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7891ccb3-d647-41f3-a207-5df1f74bb64e" containerName="extract-content" Oct 06 
07:35:32 crc kubenswrapper[4845]: I1006 07:35:32.376482 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7891ccb3-d647-41f3-a207-5df1f74bb64e" containerName="registry-server" Oct 06 07:35:32 crc kubenswrapper[4845]: I1006 07:35:32.378904 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2dbw" Oct 06 07:35:32 crc kubenswrapper[4845]: I1006 07:35:32.391034 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2dbw"] Oct 06 07:35:32 crc kubenswrapper[4845]: I1006 07:35:32.483319 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87313962-820c-4ca5-8a6a-f656bcddc2d7-utilities\") pod \"redhat-marketplace-r2dbw\" (UID: \"87313962-820c-4ca5-8a6a-f656bcddc2d7\") " pod="openshift-marketplace/redhat-marketplace-r2dbw" Oct 06 07:35:32 crc kubenswrapper[4845]: I1006 07:35:32.483371 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87313962-820c-4ca5-8a6a-f656bcddc2d7-catalog-content\") pod \"redhat-marketplace-r2dbw\" (UID: \"87313962-820c-4ca5-8a6a-f656bcddc2d7\") " pod="openshift-marketplace/redhat-marketplace-r2dbw" Oct 06 07:35:32 crc kubenswrapper[4845]: I1006 07:35:32.483503 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-895fp\" (UniqueName: \"kubernetes.io/projected/87313962-820c-4ca5-8a6a-f656bcddc2d7-kube-api-access-895fp\") pod \"redhat-marketplace-r2dbw\" (UID: \"87313962-820c-4ca5-8a6a-f656bcddc2d7\") " pod="openshift-marketplace/redhat-marketplace-r2dbw" Oct 06 07:35:32 crc kubenswrapper[4845]: I1006 07:35:32.585239 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/87313962-820c-4ca5-8a6a-f656bcddc2d7-utilities\") pod \"redhat-marketplace-r2dbw\" (UID: \"87313962-820c-4ca5-8a6a-f656bcddc2d7\") " pod="openshift-marketplace/redhat-marketplace-r2dbw" Oct 06 07:35:32 crc kubenswrapper[4845]: I1006 07:35:32.585295 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87313962-820c-4ca5-8a6a-f656bcddc2d7-catalog-content\") pod \"redhat-marketplace-r2dbw\" (UID: \"87313962-820c-4ca5-8a6a-f656bcddc2d7\") " pod="openshift-marketplace/redhat-marketplace-r2dbw" Oct 06 07:35:32 crc kubenswrapper[4845]: I1006 07:35:32.585405 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-895fp\" (UniqueName: \"kubernetes.io/projected/87313962-820c-4ca5-8a6a-f656bcddc2d7-kube-api-access-895fp\") pod \"redhat-marketplace-r2dbw\" (UID: \"87313962-820c-4ca5-8a6a-f656bcddc2d7\") " pod="openshift-marketplace/redhat-marketplace-r2dbw" Oct 06 07:35:32 crc kubenswrapper[4845]: I1006 07:35:32.586359 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87313962-820c-4ca5-8a6a-f656bcddc2d7-utilities\") pod \"redhat-marketplace-r2dbw\" (UID: \"87313962-820c-4ca5-8a6a-f656bcddc2d7\") " pod="openshift-marketplace/redhat-marketplace-r2dbw" Oct 06 07:35:32 crc kubenswrapper[4845]: I1006 07:35:32.586459 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87313962-820c-4ca5-8a6a-f656bcddc2d7-catalog-content\") pod \"redhat-marketplace-r2dbw\" (UID: \"87313962-820c-4ca5-8a6a-f656bcddc2d7\") " pod="openshift-marketplace/redhat-marketplace-r2dbw" Oct 06 07:35:32 crc kubenswrapper[4845]: I1006 07:35:32.609565 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-895fp\" (UniqueName: 
\"kubernetes.io/projected/87313962-820c-4ca5-8a6a-f656bcddc2d7-kube-api-access-895fp\") pod \"redhat-marketplace-r2dbw\" (UID: \"87313962-820c-4ca5-8a6a-f656bcddc2d7\") " pod="openshift-marketplace/redhat-marketplace-r2dbw" Oct 06 07:35:32 crc kubenswrapper[4845]: I1006 07:35:32.709241 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2dbw" Oct 06 07:35:33 crc kubenswrapper[4845]: I1006 07:35:33.165257 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2dbw"] Oct 06 07:35:33 crc kubenswrapper[4845]: I1006 07:35:33.525287 4845 generic.go:334] "Generic (PLEG): container finished" podID="87313962-820c-4ca5-8a6a-f656bcddc2d7" containerID="53c84979698ee17de82612ec4af5b7ca3d7bf150eee48c3cc145fe92fbbfeffe" exitCode=0 Oct 06 07:35:33 crc kubenswrapper[4845]: I1006 07:35:33.525416 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2dbw" event={"ID":"87313962-820c-4ca5-8a6a-f656bcddc2d7","Type":"ContainerDied","Data":"53c84979698ee17de82612ec4af5b7ca3d7bf150eee48c3cc145fe92fbbfeffe"} Oct 06 07:35:33 crc kubenswrapper[4845]: I1006 07:35:33.525605 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2dbw" event={"ID":"87313962-820c-4ca5-8a6a-f656bcddc2d7","Type":"ContainerStarted","Data":"eeaf276541359d3e198cb8806703fcb140735be195bcc66f32fc8e860c3c6d16"} Oct 06 07:35:34 crc kubenswrapper[4845]: I1006 07:35:34.535730 4845 generic.go:334] "Generic (PLEG): container finished" podID="87313962-820c-4ca5-8a6a-f656bcddc2d7" containerID="ef71f887bb1200c2d32a787480394bbee10525ac428cfc47129226ccb7f0e512" exitCode=0 Oct 06 07:35:34 crc kubenswrapper[4845]: I1006 07:35:34.535772 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2dbw" 
event={"ID":"87313962-820c-4ca5-8a6a-f656bcddc2d7","Type":"ContainerDied","Data":"ef71f887bb1200c2d32a787480394bbee10525ac428cfc47129226ccb7f0e512"} Oct 06 07:35:35 crc kubenswrapper[4845]: I1006 07:35:35.554023 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2dbw" event={"ID":"87313962-820c-4ca5-8a6a-f656bcddc2d7","Type":"ContainerStarted","Data":"f2d2062973201eae80960c8548c12574a64d4f2f481d835ca373be9115e5a571"} Oct 06 07:35:35 crc kubenswrapper[4845]: I1006 07:35:35.578529 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r2dbw" podStartSLOduration=2.153530652 podStartE2EDuration="3.578508027s" podCreationTimestamp="2025-10-06 07:35:32 +0000 UTC" firstStartedPulling="2025-10-06 07:35:33.527507884 +0000 UTC m=+3018.042248892" lastFinishedPulling="2025-10-06 07:35:34.952485259 +0000 UTC m=+3019.467226267" observedRunningTime="2025-10-06 07:35:35.577539502 +0000 UTC m=+3020.092280520" watchObservedRunningTime="2025-10-06 07:35:35.578508027 +0000 UTC m=+3020.093249135" Oct 06 07:35:39 crc kubenswrapper[4845]: I1006 07:35:39.227435 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:35:39 crc kubenswrapper[4845]: E1006 07:35:39.228457 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:35:42 crc kubenswrapper[4845]: I1006 07:35:42.710005 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r2dbw" Oct 06 07:35:42 crc 
kubenswrapper[4845]: I1006 07:35:42.710608 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r2dbw" Oct 06 07:35:42 crc kubenswrapper[4845]: I1006 07:35:42.758288 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r2dbw" Oct 06 07:35:43 crc kubenswrapper[4845]: I1006 07:35:43.686146 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r2dbw" Oct 06 07:35:43 crc kubenswrapper[4845]: I1006 07:35:43.738687 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2dbw"] Oct 06 07:35:45 crc kubenswrapper[4845]: I1006 07:35:45.632775 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r2dbw" podUID="87313962-820c-4ca5-8a6a-f656bcddc2d7" containerName="registry-server" containerID="cri-o://f2d2062973201eae80960c8548c12574a64d4f2f481d835ca373be9115e5a571" gracePeriod=2 Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.131474 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2dbw" Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.263739 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87313962-820c-4ca5-8a6a-f656bcddc2d7-catalog-content\") pod \"87313962-820c-4ca5-8a6a-f656bcddc2d7\" (UID: \"87313962-820c-4ca5-8a6a-f656bcddc2d7\") " Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.263894 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87313962-820c-4ca5-8a6a-f656bcddc2d7-utilities\") pod \"87313962-820c-4ca5-8a6a-f656bcddc2d7\" (UID: \"87313962-820c-4ca5-8a6a-f656bcddc2d7\") " Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.263935 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-895fp\" (UniqueName: \"kubernetes.io/projected/87313962-820c-4ca5-8a6a-f656bcddc2d7-kube-api-access-895fp\") pod \"87313962-820c-4ca5-8a6a-f656bcddc2d7\" (UID: \"87313962-820c-4ca5-8a6a-f656bcddc2d7\") " Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.264801 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87313962-820c-4ca5-8a6a-f656bcddc2d7-utilities" (OuterVolumeSpecName: "utilities") pod "87313962-820c-4ca5-8a6a-f656bcddc2d7" (UID: "87313962-820c-4ca5-8a6a-f656bcddc2d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.278288 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87313962-820c-4ca5-8a6a-f656bcddc2d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "87313962-820c-4ca5-8a6a-f656bcddc2d7" (UID: "87313962-820c-4ca5-8a6a-f656bcddc2d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.278731 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87313962-820c-4ca5-8a6a-f656bcddc2d7-kube-api-access-895fp" (OuterVolumeSpecName: "kube-api-access-895fp") pod "87313962-820c-4ca5-8a6a-f656bcddc2d7" (UID: "87313962-820c-4ca5-8a6a-f656bcddc2d7"). InnerVolumeSpecName "kube-api-access-895fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.366156 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87313962-820c-4ca5-8a6a-f656bcddc2d7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.366187 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87313962-820c-4ca5-8a6a-f656bcddc2d7-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.366197 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-895fp\" (UniqueName: \"kubernetes.io/projected/87313962-820c-4ca5-8a6a-f656bcddc2d7-kube-api-access-895fp\") on node \"crc\" DevicePath \"\"" Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.642616 4845 generic.go:334] "Generic (PLEG): container finished" podID="87313962-820c-4ca5-8a6a-f656bcddc2d7" containerID="f2d2062973201eae80960c8548c12574a64d4f2f481d835ca373be9115e5a571" exitCode=0 Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.642661 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2dbw" event={"ID":"87313962-820c-4ca5-8a6a-f656bcddc2d7","Type":"ContainerDied","Data":"f2d2062973201eae80960c8548c12574a64d4f2f481d835ca373be9115e5a571"} Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.642689 4845 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2dbw" event={"ID":"87313962-820c-4ca5-8a6a-f656bcddc2d7","Type":"ContainerDied","Data":"eeaf276541359d3e198cb8806703fcb140735be195bcc66f32fc8e860c3c6d16"} Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.642689 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2dbw" Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.642708 4845 scope.go:117] "RemoveContainer" containerID="f2d2062973201eae80960c8548c12574a64d4f2f481d835ca373be9115e5a571" Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.660904 4845 scope.go:117] "RemoveContainer" containerID="ef71f887bb1200c2d32a787480394bbee10525ac428cfc47129226ccb7f0e512" Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.677327 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2dbw"] Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.685689 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2dbw"] Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.694947 4845 scope.go:117] "RemoveContainer" containerID="53c84979698ee17de82612ec4af5b7ca3d7bf150eee48c3cc145fe92fbbfeffe" Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.724548 4845 scope.go:117] "RemoveContainer" containerID="f2d2062973201eae80960c8548c12574a64d4f2f481d835ca373be9115e5a571" Oct 06 07:35:46 crc kubenswrapper[4845]: E1006 07:35:46.725146 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2d2062973201eae80960c8548c12574a64d4f2f481d835ca373be9115e5a571\": container with ID starting with f2d2062973201eae80960c8548c12574a64d4f2f481d835ca373be9115e5a571 not found: ID does not exist" containerID="f2d2062973201eae80960c8548c12574a64d4f2f481d835ca373be9115e5a571" Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 
07:35:46.725181 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2d2062973201eae80960c8548c12574a64d4f2f481d835ca373be9115e5a571"} err="failed to get container status \"f2d2062973201eae80960c8548c12574a64d4f2f481d835ca373be9115e5a571\": rpc error: code = NotFound desc = could not find container \"f2d2062973201eae80960c8548c12574a64d4f2f481d835ca373be9115e5a571\": container with ID starting with f2d2062973201eae80960c8548c12574a64d4f2f481d835ca373be9115e5a571 not found: ID does not exist" Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.725201 4845 scope.go:117] "RemoveContainer" containerID="ef71f887bb1200c2d32a787480394bbee10525ac428cfc47129226ccb7f0e512" Oct 06 07:35:46 crc kubenswrapper[4845]: E1006 07:35:46.725735 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef71f887bb1200c2d32a787480394bbee10525ac428cfc47129226ccb7f0e512\": container with ID starting with ef71f887bb1200c2d32a787480394bbee10525ac428cfc47129226ccb7f0e512 not found: ID does not exist" containerID="ef71f887bb1200c2d32a787480394bbee10525ac428cfc47129226ccb7f0e512" Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.725777 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef71f887bb1200c2d32a787480394bbee10525ac428cfc47129226ccb7f0e512"} err="failed to get container status \"ef71f887bb1200c2d32a787480394bbee10525ac428cfc47129226ccb7f0e512\": rpc error: code = NotFound desc = could not find container \"ef71f887bb1200c2d32a787480394bbee10525ac428cfc47129226ccb7f0e512\": container with ID starting with ef71f887bb1200c2d32a787480394bbee10525ac428cfc47129226ccb7f0e512 not found: ID does not exist" Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.725806 4845 scope.go:117] "RemoveContainer" containerID="53c84979698ee17de82612ec4af5b7ca3d7bf150eee48c3cc145fe92fbbfeffe" Oct 06 07:35:46 crc 
kubenswrapper[4845]: E1006 07:35:46.726166 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53c84979698ee17de82612ec4af5b7ca3d7bf150eee48c3cc145fe92fbbfeffe\": container with ID starting with 53c84979698ee17de82612ec4af5b7ca3d7bf150eee48c3cc145fe92fbbfeffe not found: ID does not exist" containerID="53c84979698ee17de82612ec4af5b7ca3d7bf150eee48c3cc145fe92fbbfeffe" Oct 06 07:35:46 crc kubenswrapper[4845]: I1006 07:35:46.726198 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53c84979698ee17de82612ec4af5b7ca3d7bf150eee48c3cc145fe92fbbfeffe"} err="failed to get container status \"53c84979698ee17de82612ec4af5b7ca3d7bf150eee48c3cc145fe92fbbfeffe\": rpc error: code = NotFound desc = could not find container \"53c84979698ee17de82612ec4af5b7ca3d7bf150eee48c3cc145fe92fbbfeffe\": container with ID starting with 53c84979698ee17de82612ec4af5b7ca3d7bf150eee48c3cc145fe92fbbfeffe not found: ID does not exist" Oct 06 07:35:48 crc kubenswrapper[4845]: I1006 07:35:48.237823 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87313962-820c-4ca5-8a6a-f656bcddc2d7" path="/var/lib/kubelet/pods/87313962-820c-4ca5-8a6a-f656bcddc2d7/volumes" Oct 06 07:35:54 crc kubenswrapper[4845]: I1006 07:35:54.226968 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:35:54 crc kubenswrapper[4845]: E1006 07:35:54.227724 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:36:07 crc 
kubenswrapper[4845]: I1006 07:36:07.227702 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:36:07 crc kubenswrapper[4845]: E1006 07:36:07.229180 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:36:22 crc kubenswrapper[4845]: I1006 07:36:22.226555 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:36:22 crc kubenswrapper[4845]: E1006 07:36:22.227552 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:36:36 crc kubenswrapper[4845]: I1006 07:36:36.238191 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:36:36 crc kubenswrapper[4845]: E1006 07:36:36.239421 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 
06 07:36:51 crc kubenswrapper[4845]: I1006 07:36:51.227369 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:36:51 crc kubenswrapper[4845]: E1006 07:36:51.228470 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:37:05 crc kubenswrapper[4845]: I1006 07:37:05.227445 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:37:05 crc kubenswrapper[4845]: E1006 07:37:05.228223 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:37:20 crc kubenswrapper[4845]: I1006 07:37:20.227531 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:37:20 crc kubenswrapper[4845]: E1006 07:37:20.228215 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" 
podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:37:31 crc kubenswrapper[4845]: I1006 07:37:31.228097 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:37:31 crc kubenswrapper[4845]: E1006 07:37:31.229357 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:37:39 crc kubenswrapper[4845]: I1006 07:37:39.560672 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g9l95"] Oct 06 07:37:39 crc kubenswrapper[4845]: E1006 07:37:39.561632 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87313962-820c-4ca5-8a6a-f656bcddc2d7" containerName="registry-server" Oct 06 07:37:39 crc kubenswrapper[4845]: I1006 07:37:39.561644 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="87313962-820c-4ca5-8a6a-f656bcddc2d7" containerName="registry-server" Oct 06 07:37:39 crc kubenswrapper[4845]: E1006 07:37:39.561670 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87313962-820c-4ca5-8a6a-f656bcddc2d7" containerName="extract-utilities" Oct 06 07:37:39 crc kubenswrapper[4845]: I1006 07:37:39.561677 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="87313962-820c-4ca5-8a6a-f656bcddc2d7" containerName="extract-utilities" Oct 06 07:37:39 crc kubenswrapper[4845]: E1006 07:37:39.561704 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87313962-820c-4ca5-8a6a-f656bcddc2d7" containerName="extract-content" Oct 06 07:37:39 crc kubenswrapper[4845]: I1006 07:37:39.561710 4845 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="87313962-820c-4ca5-8a6a-f656bcddc2d7" containerName="extract-content" Oct 06 07:37:39 crc kubenswrapper[4845]: I1006 07:37:39.561921 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="87313962-820c-4ca5-8a6a-f656bcddc2d7" containerName="registry-server" Oct 06 07:37:39 crc kubenswrapper[4845]: I1006 07:37:39.564660 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g9l95" Oct 06 07:37:39 crc kubenswrapper[4845]: I1006 07:37:39.574646 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g9l95"] Oct 06 07:37:39 crc kubenswrapper[4845]: I1006 07:37:39.749746 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/877af51a-2b76-4c43-bc2f-fc711fedfdda-catalog-content\") pod \"redhat-operators-g9l95\" (UID: \"877af51a-2b76-4c43-bc2f-fc711fedfdda\") " pod="openshift-marketplace/redhat-operators-g9l95" Oct 06 07:37:39 crc kubenswrapper[4845]: I1006 07:37:39.750009 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4nnc\" (UniqueName: \"kubernetes.io/projected/877af51a-2b76-4c43-bc2f-fc711fedfdda-kube-api-access-n4nnc\") pod \"redhat-operators-g9l95\" (UID: \"877af51a-2b76-4c43-bc2f-fc711fedfdda\") " pod="openshift-marketplace/redhat-operators-g9l95" Oct 06 07:37:39 crc kubenswrapper[4845]: I1006 07:37:39.750120 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/877af51a-2b76-4c43-bc2f-fc711fedfdda-utilities\") pod \"redhat-operators-g9l95\" (UID: \"877af51a-2b76-4c43-bc2f-fc711fedfdda\") " pod="openshift-marketplace/redhat-operators-g9l95" Oct 06 07:37:39 crc kubenswrapper[4845]: I1006 07:37:39.851951 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/877af51a-2b76-4c43-bc2f-fc711fedfdda-catalog-content\") pod \"redhat-operators-g9l95\" (UID: \"877af51a-2b76-4c43-bc2f-fc711fedfdda\") " pod="openshift-marketplace/redhat-operators-g9l95" Oct 06 07:37:39 crc kubenswrapper[4845]: I1006 07:37:39.852161 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4nnc\" (UniqueName: \"kubernetes.io/projected/877af51a-2b76-4c43-bc2f-fc711fedfdda-kube-api-access-n4nnc\") pod \"redhat-operators-g9l95\" (UID: \"877af51a-2b76-4c43-bc2f-fc711fedfdda\") " pod="openshift-marketplace/redhat-operators-g9l95" Oct 06 07:37:39 crc kubenswrapper[4845]: I1006 07:37:39.852209 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/877af51a-2b76-4c43-bc2f-fc711fedfdda-utilities\") pod \"redhat-operators-g9l95\" (UID: \"877af51a-2b76-4c43-bc2f-fc711fedfdda\") " pod="openshift-marketplace/redhat-operators-g9l95" Oct 06 07:37:39 crc kubenswrapper[4845]: I1006 07:37:39.852496 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/877af51a-2b76-4c43-bc2f-fc711fedfdda-catalog-content\") pod \"redhat-operators-g9l95\" (UID: \"877af51a-2b76-4c43-bc2f-fc711fedfdda\") " pod="openshift-marketplace/redhat-operators-g9l95" Oct 06 07:37:39 crc kubenswrapper[4845]: I1006 07:37:39.853070 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/877af51a-2b76-4c43-bc2f-fc711fedfdda-utilities\") pod \"redhat-operators-g9l95\" (UID: \"877af51a-2b76-4c43-bc2f-fc711fedfdda\") " pod="openshift-marketplace/redhat-operators-g9l95" Oct 06 07:37:39 crc kubenswrapper[4845]: I1006 07:37:39.874891 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4nnc\" 
(UniqueName: \"kubernetes.io/projected/877af51a-2b76-4c43-bc2f-fc711fedfdda-kube-api-access-n4nnc\") pod \"redhat-operators-g9l95\" (UID: \"877af51a-2b76-4c43-bc2f-fc711fedfdda\") " pod="openshift-marketplace/redhat-operators-g9l95" Oct 06 07:37:39 crc kubenswrapper[4845]: I1006 07:37:39.923288 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g9l95" Oct 06 07:37:40 crc kubenswrapper[4845]: I1006 07:37:40.385520 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g9l95"] Oct 06 07:37:40 crc kubenswrapper[4845]: W1006 07:37:40.395642 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod877af51a_2b76_4c43_bc2f_fc711fedfdda.slice/crio-c83b4a7cfccee5c1c0ed88c370ce16bac7cf2f29d2ff9027b57800694416f4d0 WatchSource:0}: Error finding container c83b4a7cfccee5c1c0ed88c370ce16bac7cf2f29d2ff9027b57800694416f4d0: Status 404 returned error can't find the container with id c83b4a7cfccee5c1c0ed88c370ce16bac7cf2f29d2ff9027b57800694416f4d0 Oct 06 07:37:40 crc kubenswrapper[4845]: I1006 07:37:40.743315 4845 generic.go:334] "Generic (PLEG): container finished" podID="877af51a-2b76-4c43-bc2f-fc711fedfdda" containerID="0e0f3b8cbf9554ebff2b5dbd925835b169790f9a7b9c151642eb8e998333cb80" exitCode=0 Oct 06 07:37:40 crc kubenswrapper[4845]: I1006 07:37:40.743365 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9l95" event={"ID":"877af51a-2b76-4c43-bc2f-fc711fedfdda","Type":"ContainerDied","Data":"0e0f3b8cbf9554ebff2b5dbd925835b169790f9a7b9c151642eb8e998333cb80"} Oct 06 07:37:40 crc kubenswrapper[4845]: I1006 07:37:40.743420 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9l95" 
event={"ID":"877af51a-2b76-4c43-bc2f-fc711fedfdda","Type":"ContainerStarted","Data":"c83b4a7cfccee5c1c0ed88c370ce16bac7cf2f29d2ff9027b57800694416f4d0"} Oct 06 07:37:41 crc kubenswrapper[4845]: I1006 07:37:41.752307 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9l95" event={"ID":"877af51a-2b76-4c43-bc2f-fc711fedfdda","Type":"ContainerStarted","Data":"1c43ae77222f201f6fb15799625b9eca385f7824813454c0499ca52bb40a7cbb"} Oct 06 07:37:42 crc kubenswrapper[4845]: I1006 07:37:42.762508 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9l95" event={"ID":"877af51a-2b76-4c43-bc2f-fc711fedfdda","Type":"ContainerDied","Data":"1c43ae77222f201f6fb15799625b9eca385f7824813454c0499ca52bb40a7cbb"} Oct 06 07:37:42 crc kubenswrapper[4845]: I1006 07:37:42.762368 4845 generic.go:334] "Generic (PLEG): container finished" podID="877af51a-2b76-4c43-bc2f-fc711fedfdda" containerID="1c43ae77222f201f6fb15799625b9eca385f7824813454c0499ca52bb40a7cbb" exitCode=0 Oct 06 07:37:43 crc kubenswrapper[4845]: I1006 07:37:43.772945 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9l95" event={"ID":"877af51a-2b76-4c43-bc2f-fc711fedfdda","Type":"ContainerStarted","Data":"905dcad388daf098a341335f9150ee1f9e222a3ce75a3e18efeb56b1bd5b21c4"} Oct 06 07:37:43 crc kubenswrapper[4845]: I1006 07:37:43.792104 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g9l95" podStartSLOduration=2.406569349 podStartE2EDuration="4.792089778s" podCreationTimestamp="2025-10-06 07:37:39 +0000 UTC" firstStartedPulling="2025-10-06 07:37:40.758864095 +0000 UTC m=+3145.273605103" lastFinishedPulling="2025-10-06 07:37:43.144384524 +0000 UTC m=+3147.659125532" observedRunningTime="2025-10-06 07:37:43.787195455 +0000 UTC m=+3148.301936463" watchObservedRunningTime="2025-10-06 07:37:43.792089778 +0000 UTC m=+3148.306830786" 
Oct 06 07:37:44 crc kubenswrapper[4845]: I1006 07:37:44.227079 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:37:44 crc kubenswrapper[4845]: E1006 07:37:44.227521 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:37:49 crc kubenswrapper[4845]: I1006 07:37:49.923991 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g9l95" Oct 06 07:37:49 crc kubenswrapper[4845]: I1006 07:37:49.924750 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g9l95" Oct 06 07:37:49 crc kubenswrapper[4845]: I1006 07:37:49.975077 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g9l95" Oct 06 07:37:50 crc kubenswrapper[4845]: I1006 07:37:50.908257 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g9l95" Oct 06 07:37:50 crc kubenswrapper[4845]: I1006 07:37:50.954032 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g9l95"] Oct 06 07:37:52 crc kubenswrapper[4845]: I1006 07:37:52.884688 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g9l95" podUID="877af51a-2b76-4c43-bc2f-fc711fedfdda" containerName="registry-server" containerID="cri-o://905dcad388daf098a341335f9150ee1f9e222a3ce75a3e18efeb56b1bd5b21c4" gracePeriod=2 Oct 06 07:37:53 crc kubenswrapper[4845]: 
I1006 07:37:53.389613 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g9l95" Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.502304 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/877af51a-2b76-4c43-bc2f-fc711fedfdda-catalog-content\") pod \"877af51a-2b76-4c43-bc2f-fc711fedfdda\" (UID: \"877af51a-2b76-4c43-bc2f-fc711fedfdda\") " Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.502778 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4nnc\" (UniqueName: \"kubernetes.io/projected/877af51a-2b76-4c43-bc2f-fc711fedfdda-kube-api-access-n4nnc\") pod \"877af51a-2b76-4c43-bc2f-fc711fedfdda\" (UID: \"877af51a-2b76-4c43-bc2f-fc711fedfdda\") " Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.502879 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/877af51a-2b76-4c43-bc2f-fc711fedfdda-utilities\") pod \"877af51a-2b76-4c43-bc2f-fc711fedfdda\" (UID: \"877af51a-2b76-4c43-bc2f-fc711fedfdda\") " Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.503692 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/877af51a-2b76-4c43-bc2f-fc711fedfdda-utilities" (OuterVolumeSpecName: "utilities") pod "877af51a-2b76-4c43-bc2f-fc711fedfdda" (UID: "877af51a-2b76-4c43-bc2f-fc711fedfdda"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.515409 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/877af51a-2b76-4c43-bc2f-fc711fedfdda-kube-api-access-n4nnc" (OuterVolumeSpecName: "kube-api-access-n4nnc") pod "877af51a-2b76-4c43-bc2f-fc711fedfdda" (UID: "877af51a-2b76-4c43-bc2f-fc711fedfdda"). InnerVolumeSpecName "kube-api-access-n4nnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.579513 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/877af51a-2b76-4c43-bc2f-fc711fedfdda-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "877af51a-2b76-4c43-bc2f-fc711fedfdda" (UID: "877af51a-2b76-4c43-bc2f-fc711fedfdda"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.605617 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/877af51a-2b76-4c43-bc2f-fc711fedfdda-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.605672 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4nnc\" (UniqueName: \"kubernetes.io/projected/877af51a-2b76-4c43-bc2f-fc711fedfdda-kube-api-access-n4nnc\") on node \"crc\" DevicePath \"\"" Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.605692 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/877af51a-2b76-4c43-bc2f-fc711fedfdda-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.893030 4845 generic.go:334] "Generic (PLEG): container finished" podID="877af51a-2b76-4c43-bc2f-fc711fedfdda" 
containerID="905dcad388daf098a341335f9150ee1f9e222a3ce75a3e18efeb56b1bd5b21c4" exitCode=0 Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.893085 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9l95" event={"ID":"877af51a-2b76-4c43-bc2f-fc711fedfdda","Type":"ContainerDied","Data":"905dcad388daf098a341335f9150ee1f9e222a3ce75a3e18efeb56b1bd5b21c4"} Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.893109 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9l95" event={"ID":"877af51a-2b76-4c43-bc2f-fc711fedfdda","Type":"ContainerDied","Data":"c83b4a7cfccee5c1c0ed88c370ce16bac7cf2f29d2ff9027b57800694416f4d0"} Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.893123 4845 scope.go:117] "RemoveContainer" containerID="905dcad388daf098a341335f9150ee1f9e222a3ce75a3e18efeb56b1bd5b21c4" Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.893090 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g9l95" Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.910122 4845 scope.go:117] "RemoveContainer" containerID="1c43ae77222f201f6fb15799625b9eca385f7824813454c0499ca52bb40a7cbb" Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.928121 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g9l95"] Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.939225 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g9l95"] Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.956433 4845 scope.go:117] "RemoveContainer" containerID="0e0f3b8cbf9554ebff2b5dbd925835b169790f9a7b9c151642eb8e998333cb80" Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.977710 4845 scope.go:117] "RemoveContainer" containerID="905dcad388daf098a341335f9150ee1f9e222a3ce75a3e18efeb56b1bd5b21c4" Oct 06 07:37:53 crc kubenswrapper[4845]: E1006 07:37:53.978120 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"905dcad388daf098a341335f9150ee1f9e222a3ce75a3e18efeb56b1bd5b21c4\": container with ID starting with 905dcad388daf098a341335f9150ee1f9e222a3ce75a3e18efeb56b1bd5b21c4 not found: ID does not exist" containerID="905dcad388daf098a341335f9150ee1f9e222a3ce75a3e18efeb56b1bd5b21c4" Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.978162 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"905dcad388daf098a341335f9150ee1f9e222a3ce75a3e18efeb56b1bd5b21c4"} err="failed to get container status \"905dcad388daf098a341335f9150ee1f9e222a3ce75a3e18efeb56b1bd5b21c4\": rpc error: code = NotFound desc = could not find container \"905dcad388daf098a341335f9150ee1f9e222a3ce75a3e18efeb56b1bd5b21c4\": container with ID starting with 905dcad388daf098a341335f9150ee1f9e222a3ce75a3e18efeb56b1bd5b21c4 not found: ID does 
not exist" Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.978191 4845 scope.go:117] "RemoveContainer" containerID="1c43ae77222f201f6fb15799625b9eca385f7824813454c0499ca52bb40a7cbb" Oct 06 07:37:53 crc kubenswrapper[4845]: E1006 07:37:53.978569 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c43ae77222f201f6fb15799625b9eca385f7824813454c0499ca52bb40a7cbb\": container with ID starting with 1c43ae77222f201f6fb15799625b9eca385f7824813454c0499ca52bb40a7cbb not found: ID does not exist" containerID="1c43ae77222f201f6fb15799625b9eca385f7824813454c0499ca52bb40a7cbb" Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.978604 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c43ae77222f201f6fb15799625b9eca385f7824813454c0499ca52bb40a7cbb"} err="failed to get container status \"1c43ae77222f201f6fb15799625b9eca385f7824813454c0499ca52bb40a7cbb\": rpc error: code = NotFound desc = could not find container \"1c43ae77222f201f6fb15799625b9eca385f7824813454c0499ca52bb40a7cbb\": container with ID starting with 1c43ae77222f201f6fb15799625b9eca385f7824813454c0499ca52bb40a7cbb not found: ID does not exist" Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.978624 4845 scope.go:117] "RemoveContainer" containerID="0e0f3b8cbf9554ebff2b5dbd925835b169790f9a7b9c151642eb8e998333cb80" Oct 06 07:37:53 crc kubenswrapper[4845]: E1006 07:37:53.978956 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e0f3b8cbf9554ebff2b5dbd925835b169790f9a7b9c151642eb8e998333cb80\": container with ID starting with 0e0f3b8cbf9554ebff2b5dbd925835b169790f9a7b9c151642eb8e998333cb80 not found: ID does not exist" containerID="0e0f3b8cbf9554ebff2b5dbd925835b169790f9a7b9c151642eb8e998333cb80" Oct 06 07:37:53 crc kubenswrapper[4845]: I1006 07:37:53.978985 4845 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e0f3b8cbf9554ebff2b5dbd925835b169790f9a7b9c151642eb8e998333cb80"} err="failed to get container status \"0e0f3b8cbf9554ebff2b5dbd925835b169790f9a7b9c151642eb8e998333cb80\": rpc error: code = NotFound desc = could not find container \"0e0f3b8cbf9554ebff2b5dbd925835b169790f9a7b9c151642eb8e998333cb80\": container with ID starting with 0e0f3b8cbf9554ebff2b5dbd925835b169790f9a7b9c151642eb8e998333cb80 not found: ID does not exist" Oct 06 07:37:54 crc kubenswrapper[4845]: I1006 07:37:54.238901 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="877af51a-2b76-4c43-bc2f-fc711fedfdda" path="/var/lib/kubelet/pods/877af51a-2b76-4c43-bc2f-fc711fedfdda/volumes" Oct 06 07:37:57 crc kubenswrapper[4845]: I1006 07:37:57.226743 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:37:57 crc kubenswrapper[4845]: E1006 07:37:57.227465 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:38:08 crc kubenswrapper[4845]: I1006 07:38:08.227587 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:38:08 crc kubenswrapper[4845]: E1006 07:38:08.228438 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:38:20 crc kubenswrapper[4845]: I1006 07:38:20.226449 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:38:20 crc kubenswrapper[4845]: E1006 07:38:20.227166 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:38:33 crc kubenswrapper[4845]: I1006 07:38:33.229247 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:38:34 crc kubenswrapper[4845]: I1006 07:38:34.246093 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerStarted","Data":"d44e32c11f635c2d507d4f8ff409ad552dde0d564827032df278547fcdec354d"} Oct 06 07:38:40 crc kubenswrapper[4845]: I1006 07:38:40.290346 4845 generic.go:334] "Generic (PLEG): container finished" podID="1fdf5d3a-7d9e-4702-ae15-1373bbd94574" containerID="9772af0f187f9d7bcf36aac88902b0557fdf588742f04828b95ad85e4383d94e" exitCode=0 Oct 06 07:38:40 crc kubenswrapper[4845]: I1006 07:38:40.292414 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"1fdf5d3a-7d9e-4702-ae15-1373bbd94574","Type":"ContainerDied","Data":"9772af0f187f9d7bcf36aac88902b0557fdf588742f04828b95ad85e4383d94e"} Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.657248 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.749271 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-config-data\") pod \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.749411 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-test-operator-ephemeral-temporary\") pod \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.749447 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-test-operator-ephemeral-workdir\") pod \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.749477 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-openstack-config-secret\") pod \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.749498 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.749544 4845 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-ssh-key\") pod \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.749610 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-ca-certs\") pod \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.749640 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-openstack-config\") pod \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.749665 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkpd5\" (UniqueName: \"kubernetes.io/projected/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-kube-api-access-gkpd5\") pod \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\" (UID: \"1fdf5d3a-7d9e-4702-ae15-1373bbd94574\") " Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.751727 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-config-data" (OuterVolumeSpecName: "config-data") pod "1fdf5d3a-7d9e-4702-ae15-1373bbd94574" (UID: "1fdf5d3a-7d9e-4702-ae15-1373bbd94574"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.753456 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "1fdf5d3a-7d9e-4702-ae15-1373bbd94574" (UID: "1fdf5d3a-7d9e-4702-ae15-1373bbd94574"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.756163 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-kube-api-access-gkpd5" (OuterVolumeSpecName: "kube-api-access-gkpd5") pod "1fdf5d3a-7d9e-4702-ae15-1373bbd94574" (UID: "1fdf5d3a-7d9e-4702-ae15-1373bbd94574"). InnerVolumeSpecName "kube-api-access-gkpd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.756879 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "1fdf5d3a-7d9e-4702-ae15-1373bbd94574" (UID: "1fdf5d3a-7d9e-4702-ae15-1373bbd94574"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.758082 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "test-operator-logs") pod "1fdf5d3a-7d9e-4702-ae15-1373bbd94574" (UID: "1fdf5d3a-7d9e-4702-ae15-1373bbd94574"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.778804 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "1fdf5d3a-7d9e-4702-ae15-1373bbd94574" (UID: "1fdf5d3a-7d9e-4702-ae15-1373bbd94574"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.781132 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1fdf5d3a-7d9e-4702-ae15-1373bbd94574" (UID: "1fdf5d3a-7d9e-4702-ae15-1373bbd94574"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.783470 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "1fdf5d3a-7d9e-4702-ae15-1373bbd94574" (UID: "1fdf5d3a-7d9e-4702-ae15-1373bbd94574"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.798429 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1fdf5d3a-7d9e-4702-ae15-1373bbd94574" (UID: "1fdf5d3a-7d9e-4702-ae15-1373bbd94574"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.852001 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.852034 4845 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.852049 4845 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.852060 4845 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.852094 4845 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.852105 4845 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.852114 4845 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 06 07:38:41 crc 
kubenswrapper[4845]: I1006 07:38:41.852122 4845 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.852131 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkpd5\" (UniqueName: \"kubernetes.io/projected/1fdf5d3a-7d9e-4702-ae15-1373bbd94574-kube-api-access-gkpd5\") on node \"crc\" DevicePath \"\"" Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.895439 4845 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 06 07:38:41 crc kubenswrapper[4845]: I1006 07:38:41.954089 4845 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 06 07:38:42 crc kubenswrapper[4845]: I1006 07:38:42.310565 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"1fdf5d3a-7d9e-4702-ae15-1373bbd94574","Type":"ContainerDied","Data":"b612fc75e917be070d32dd6a223c0d683826a0bf36c34d1607e36074b2458206"} Oct 06 07:38:42 crc kubenswrapper[4845]: I1006 07:38:42.310686 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b612fc75e917be070d32dd6a223c0d683826a0bf36c34d1607e36074b2458206" Oct 06 07:38:42 crc kubenswrapper[4845]: I1006 07:38:42.310607 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 06 07:38:45 crc kubenswrapper[4845]: I1006 07:38:45.629609 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 06 07:38:45 crc kubenswrapper[4845]: E1006 07:38:45.630847 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877af51a-2b76-4c43-bc2f-fc711fedfdda" containerName="extract-utilities" Oct 06 07:38:45 crc kubenswrapper[4845]: I1006 07:38:45.630865 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="877af51a-2b76-4c43-bc2f-fc711fedfdda" containerName="extract-utilities" Oct 06 07:38:45 crc kubenswrapper[4845]: E1006 07:38:45.630893 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fdf5d3a-7d9e-4702-ae15-1373bbd94574" containerName="tempest-tests-tempest-tests-runner" Oct 06 07:38:45 crc kubenswrapper[4845]: I1006 07:38:45.630901 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fdf5d3a-7d9e-4702-ae15-1373bbd94574" containerName="tempest-tests-tempest-tests-runner" Oct 06 07:38:45 crc kubenswrapper[4845]: E1006 07:38:45.630914 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877af51a-2b76-4c43-bc2f-fc711fedfdda" containerName="extract-content" Oct 06 07:38:45 crc kubenswrapper[4845]: I1006 07:38:45.630922 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="877af51a-2b76-4c43-bc2f-fc711fedfdda" containerName="extract-content" Oct 06 07:38:45 crc kubenswrapper[4845]: E1006 07:38:45.630970 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877af51a-2b76-4c43-bc2f-fc711fedfdda" containerName="registry-server" Oct 06 07:38:45 crc kubenswrapper[4845]: I1006 07:38:45.630982 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="877af51a-2b76-4c43-bc2f-fc711fedfdda" containerName="registry-server" Oct 06 07:38:45 crc kubenswrapper[4845]: I1006 07:38:45.631177 4845 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="877af51a-2b76-4c43-bc2f-fc711fedfdda" containerName="registry-server" Oct 06 07:38:45 crc kubenswrapper[4845]: I1006 07:38:45.631194 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fdf5d3a-7d9e-4702-ae15-1373bbd94574" containerName="tempest-tests-tempest-tests-runner" Oct 06 07:38:45 crc kubenswrapper[4845]: I1006 07:38:45.631910 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 07:38:45 crc kubenswrapper[4845]: I1006 07:38:45.634230 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jqvnt" Oct 06 07:38:45 crc kubenswrapper[4845]: I1006 07:38:45.640932 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 06 07:38:45 crc kubenswrapper[4845]: I1006 07:38:45.731969 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a918bcf8-062d-4a05-8f1b-bc24088f12b7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 07:38:45 crc kubenswrapper[4845]: I1006 07:38:45.732112 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx26h\" (UniqueName: \"kubernetes.io/projected/a918bcf8-062d-4a05-8f1b-bc24088f12b7-kube-api-access-wx26h\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a918bcf8-062d-4a05-8f1b-bc24088f12b7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 07:38:45 crc kubenswrapper[4845]: I1006 07:38:45.833634 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a918bcf8-062d-4a05-8f1b-bc24088f12b7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 07:38:45 crc kubenswrapper[4845]: I1006 07:38:45.833774 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx26h\" (UniqueName: \"kubernetes.io/projected/a918bcf8-062d-4a05-8f1b-bc24088f12b7-kube-api-access-wx26h\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a918bcf8-062d-4a05-8f1b-bc24088f12b7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 07:38:45 crc kubenswrapper[4845]: I1006 07:38:45.834104 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a918bcf8-062d-4a05-8f1b-bc24088f12b7\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 07:38:45 crc kubenswrapper[4845]: I1006 07:38:45.855997 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx26h\" (UniqueName: \"kubernetes.io/projected/a918bcf8-062d-4a05-8f1b-bc24088f12b7-kube-api-access-wx26h\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a918bcf8-062d-4a05-8f1b-bc24088f12b7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 07:38:45 crc kubenswrapper[4845]: I1006 07:38:45.859529 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a918bcf8-062d-4a05-8f1b-bc24088f12b7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 07:38:45 
crc kubenswrapper[4845]: I1006 07:38:45.966044 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 07:38:46 crc kubenswrapper[4845]: I1006 07:38:46.510759 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 06 07:38:47 crc kubenswrapper[4845]: I1006 07:38:47.369917 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"a918bcf8-062d-4a05-8f1b-bc24088f12b7","Type":"ContainerStarted","Data":"53ca98e4f55d9c9eed0f56fcf8c971d0a20baf4fd20a65ef4baa23d0d165cb36"} Oct 06 07:38:48 crc kubenswrapper[4845]: I1006 07:38:48.386425 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"a918bcf8-062d-4a05-8f1b-bc24088f12b7","Type":"ContainerStarted","Data":"7c7b3178f928e11416a8a34c4db1f914039fc616dde48d2aa4f3289911d159a9"} Oct 06 07:38:48 crc kubenswrapper[4845]: I1006 07:38:48.416944 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.652262053 podStartE2EDuration="3.416921747s" podCreationTimestamp="2025-10-06 07:38:45 +0000 UTC" firstStartedPulling="2025-10-06 07:38:46.525108498 +0000 UTC m=+3211.039849506" lastFinishedPulling="2025-10-06 07:38:47.289768192 +0000 UTC m=+3211.804509200" observedRunningTime="2025-10-06 07:38:48.404324179 +0000 UTC m=+3212.919065197" watchObservedRunningTime="2025-10-06 07:38:48.416921747 +0000 UTC m=+3212.931662765" Oct 06 07:39:04 crc kubenswrapper[4845]: I1006 07:39:04.132030 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xcvm4/must-gather-5rnl2"] Oct 06 07:39:04 crc kubenswrapper[4845]: I1006 07:39:04.135057 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xcvm4/must-gather-5rnl2" Oct 06 07:39:04 crc kubenswrapper[4845]: I1006 07:39:04.139302 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-xcvm4"/"default-dockercfg-wjh9w" Oct 06 07:39:04 crc kubenswrapper[4845]: I1006 07:39:04.139752 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xcvm4"/"openshift-service-ca.crt" Oct 06 07:39:04 crc kubenswrapper[4845]: I1006 07:39:04.146198 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xcvm4"/"kube-root-ca.crt" Oct 06 07:39:04 crc kubenswrapper[4845]: I1006 07:39:04.150620 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xcvm4/must-gather-5rnl2"] Oct 06 07:39:04 crc kubenswrapper[4845]: I1006 07:39:04.305353 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw249\" (UniqueName: \"kubernetes.io/projected/c57034cd-1f95-4d62-9877-ef39b4ca9e86-kube-api-access-tw249\") pod \"must-gather-5rnl2\" (UID: \"c57034cd-1f95-4d62-9877-ef39b4ca9e86\") " pod="openshift-must-gather-xcvm4/must-gather-5rnl2" Oct 06 07:39:04 crc kubenswrapper[4845]: I1006 07:39:04.305573 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c57034cd-1f95-4d62-9877-ef39b4ca9e86-must-gather-output\") pod \"must-gather-5rnl2\" (UID: \"c57034cd-1f95-4d62-9877-ef39b4ca9e86\") " pod="openshift-must-gather-xcvm4/must-gather-5rnl2" Oct 06 07:39:04 crc kubenswrapper[4845]: I1006 07:39:04.407206 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw249\" (UniqueName: \"kubernetes.io/projected/c57034cd-1f95-4d62-9877-ef39b4ca9e86-kube-api-access-tw249\") pod \"must-gather-5rnl2\" (UID: \"c57034cd-1f95-4d62-9877-ef39b4ca9e86\") " 
pod="openshift-must-gather-xcvm4/must-gather-5rnl2" Oct 06 07:39:04 crc kubenswrapper[4845]: I1006 07:39:04.407726 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c57034cd-1f95-4d62-9877-ef39b4ca9e86-must-gather-output\") pod \"must-gather-5rnl2\" (UID: \"c57034cd-1f95-4d62-9877-ef39b4ca9e86\") " pod="openshift-must-gather-xcvm4/must-gather-5rnl2" Oct 06 07:39:04 crc kubenswrapper[4845]: I1006 07:39:04.408113 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c57034cd-1f95-4d62-9877-ef39b4ca9e86-must-gather-output\") pod \"must-gather-5rnl2\" (UID: \"c57034cd-1f95-4d62-9877-ef39b4ca9e86\") " pod="openshift-must-gather-xcvm4/must-gather-5rnl2" Oct 06 07:39:04 crc kubenswrapper[4845]: I1006 07:39:04.428914 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw249\" (UniqueName: \"kubernetes.io/projected/c57034cd-1f95-4d62-9877-ef39b4ca9e86-kube-api-access-tw249\") pod \"must-gather-5rnl2\" (UID: \"c57034cd-1f95-4d62-9877-ef39b4ca9e86\") " pod="openshift-must-gather-xcvm4/must-gather-5rnl2" Oct 06 07:39:04 crc kubenswrapper[4845]: I1006 07:39:04.457927 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xcvm4/must-gather-5rnl2" Oct 06 07:39:04 crc kubenswrapper[4845]: I1006 07:39:04.903563 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xcvm4/must-gather-5rnl2"] Oct 06 07:39:04 crc kubenswrapper[4845]: W1006 07:39:04.923661 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc57034cd_1f95_4d62_9877_ef39b4ca9e86.slice/crio-a9bd82d3b982d75a7b4699977cd2143cc3091b59208d05611d86b15d9abb4602 WatchSource:0}: Error finding container a9bd82d3b982d75a7b4699977cd2143cc3091b59208d05611d86b15d9abb4602: Status 404 returned error can't find the container with id a9bd82d3b982d75a7b4699977cd2143cc3091b59208d05611d86b15d9abb4602 Oct 06 07:39:05 crc kubenswrapper[4845]: I1006 07:39:05.562949 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xcvm4/must-gather-5rnl2" event={"ID":"c57034cd-1f95-4d62-9877-ef39b4ca9e86","Type":"ContainerStarted","Data":"a9bd82d3b982d75a7b4699977cd2143cc3091b59208d05611d86b15d9abb4602"} Oct 06 07:39:09 crc kubenswrapper[4845]: I1006 07:39:09.601086 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xcvm4/must-gather-5rnl2" event={"ID":"c57034cd-1f95-4d62-9877-ef39b4ca9e86","Type":"ContainerStarted","Data":"b3efd07ef8de7479182c74741171acb3d2bb590c32ad0ba373d36a3c140e29ec"} Oct 06 07:39:09 crc kubenswrapper[4845]: I1006 07:39:09.601577 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xcvm4/must-gather-5rnl2" event={"ID":"c57034cd-1f95-4d62-9877-ef39b4ca9e86","Type":"ContainerStarted","Data":"f5fe1e3a752cec4ac5cf2d3274099d223b91be96adc1250135e02314c4ebf035"} Oct 06 07:39:09 crc kubenswrapper[4845]: I1006 07:39:09.636232 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xcvm4/must-gather-5rnl2" podStartSLOduration=2.132402954 
podStartE2EDuration="5.636210055s" podCreationTimestamp="2025-10-06 07:39:04 +0000 UTC" firstStartedPulling="2025-10-06 07:39:04.933508439 +0000 UTC m=+3229.448249447" lastFinishedPulling="2025-10-06 07:39:08.43731554 +0000 UTC m=+3232.952056548" observedRunningTime="2025-10-06 07:39:09.6221414 +0000 UTC m=+3234.136882408" watchObservedRunningTime="2025-10-06 07:39:09.636210055 +0000 UTC m=+3234.150951063" Oct 06 07:39:12 crc kubenswrapper[4845]: I1006 07:39:12.364738 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xcvm4/crc-debug-qlzm9"] Oct 06 07:39:12 crc kubenswrapper[4845]: I1006 07:39:12.367803 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xcvm4/crc-debug-qlzm9" Oct 06 07:39:12 crc kubenswrapper[4845]: I1006 07:39:12.410540 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ad4472d-e587-49e6-9a07-d678cf0e904b-host\") pod \"crc-debug-qlzm9\" (UID: \"5ad4472d-e587-49e6-9a07-d678cf0e904b\") " pod="openshift-must-gather-xcvm4/crc-debug-qlzm9" Oct 06 07:39:12 crc kubenswrapper[4845]: I1006 07:39:12.410592 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjqth\" (UniqueName: \"kubernetes.io/projected/5ad4472d-e587-49e6-9a07-d678cf0e904b-kube-api-access-sjqth\") pod \"crc-debug-qlzm9\" (UID: \"5ad4472d-e587-49e6-9a07-d678cf0e904b\") " pod="openshift-must-gather-xcvm4/crc-debug-qlzm9" Oct 06 07:39:12 crc kubenswrapper[4845]: I1006 07:39:12.511817 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ad4472d-e587-49e6-9a07-d678cf0e904b-host\") pod \"crc-debug-qlzm9\" (UID: \"5ad4472d-e587-49e6-9a07-d678cf0e904b\") " pod="openshift-must-gather-xcvm4/crc-debug-qlzm9" Oct 06 07:39:12 crc kubenswrapper[4845]: I1006 07:39:12.511882 4845 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjqth\" (UniqueName: \"kubernetes.io/projected/5ad4472d-e587-49e6-9a07-d678cf0e904b-kube-api-access-sjqth\") pod \"crc-debug-qlzm9\" (UID: \"5ad4472d-e587-49e6-9a07-d678cf0e904b\") " pod="openshift-must-gather-xcvm4/crc-debug-qlzm9" Oct 06 07:39:12 crc kubenswrapper[4845]: I1006 07:39:12.511957 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ad4472d-e587-49e6-9a07-d678cf0e904b-host\") pod \"crc-debug-qlzm9\" (UID: \"5ad4472d-e587-49e6-9a07-d678cf0e904b\") " pod="openshift-must-gather-xcvm4/crc-debug-qlzm9" Oct 06 07:39:12 crc kubenswrapper[4845]: I1006 07:39:12.532333 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjqth\" (UniqueName: \"kubernetes.io/projected/5ad4472d-e587-49e6-9a07-d678cf0e904b-kube-api-access-sjqth\") pod \"crc-debug-qlzm9\" (UID: \"5ad4472d-e587-49e6-9a07-d678cf0e904b\") " pod="openshift-must-gather-xcvm4/crc-debug-qlzm9" Oct 06 07:39:12 crc kubenswrapper[4845]: I1006 07:39:12.690064 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xcvm4/crc-debug-qlzm9" Oct 06 07:39:13 crc kubenswrapper[4845]: I1006 07:39:13.645768 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xcvm4/crc-debug-qlzm9" event={"ID":"5ad4472d-e587-49e6-9a07-d678cf0e904b","Type":"ContainerStarted","Data":"5056ed37737be500fdc96d998c7b5f9967ee1018f6268e275f8113e82386adeb"} Oct 06 07:39:24 crc kubenswrapper[4845]: I1006 07:39:24.772545 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xcvm4/crc-debug-qlzm9" event={"ID":"5ad4472d-e587-49e6-9a07-d678cf0e904b","Type":"ContainerStarted","Data":"3dc7981dec70b344ce8c0ed5c0775b7e8291db74528f1e658128f8ed00769817"} Oct 06 07:39:24 crc kubenswrapper[4845]: I1006 07:39:24.791033 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xcvm4/crc-debug-qlzm9" podStartSLOduration=1.9195367129999998 podStartE2EDuration="12.791015184s" podCreationTimestamp="2025-10-06 07:39:12 +0000 UTC" firstStartedPulling="2025-10-06 07:39:12.735149155 +0000 UTC m=+3237.249890163" lastFinishedPulling="2025-10-06 07:39:23.606627626 +0000 UTC m=+3248.121368634" observedRunningTime="2025-10-06 07:39:24.78925978 +0000 UTC m=+3249.304000788" watchObservedRunningTime="2025-10-06 07:39:24.791015184 +0000 UTC m=+3249.305756192" Oct 06 07:40:10 crc kubenswrapper[4845]: I1006 07:40:10.701623 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56f94c67bb-rmg7r_d82278b6-977b-40db-b925-10f8d7621e7c/barbican-api/0.log" Oct 06 07:40:10 crc kubenswrapper[4845]: I1006 07:40:10.702084 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56f94c67bb-rmg7r_d82278b6-977b-40db-b925-10f8d7621e7c/barbican-api-log/0.log" Oct 06 07:40:10 crc kubenswrapper[4845]: I1006 07:40:10.877693 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-c5976ff76-gcfvw_3b247c0d-1911-47a7-83bc-fad6ee6d6172/barbican-keystone-listener/0.log" Oct 06 07:40:10 crc kubenswrapper[4845]: I1006 07:40:10.957771 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c5976ff76-gcfvw_3b247c0d-1911-47a7-83bc-fad6ee6d6172/barbican-keystone-listener-log/0.log" Oct 06 07:40:11 crc kubenswrapper[4845]: I1006 07:40:11.104663 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7d85949b65-6bf6r_a8cd370d-1327-4f32-a12d-e43c99f63f23/barbican-worker/0.log" Oct 06 07:40:11 crc kubenswrapper[4845]: I1006 07:40:11.126325 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7d85949b65-6bf6r_a8cd370d-1327-4f32-a12d-e43c99f63f23/barbican-worker-log/0.log" Oct 06 07:40:11 crc kubenswrapper[4845]: I1006 07:40:11.307127 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4_5ca7a9ec-05b1-46ae-bc84-065bf4904784/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:40:11 crc kubenswrapper[4845]: I1006 07:40:11.490211 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_66468bd9-d0ea-4117-a963-3e7fb9b3c54d/ceilometer-central-agent/0.log" Oct 06 07:40:11 crc kubenswrapper[4845]: I1006 07:40:11.537152 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_66468bd9-d0ea-4117-a963-3e7fb9b3c54d/ceilometer-notification-agent/0.log" Oct 06 07:40:11 crc kubenswrapper[4845]: I1006 07:40:11.568872 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_66468bd9-d0ea-4117-a963-3e7fb9b3c54d/proxy-httpd/0.log" Oct 06 07:40:11 crc kubenswrapper[4845]: I1006 07:40:11.672010 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_66468bd9-d0ea-4117-a963-3e7fb9b3c54d/sg-core/0.log" Oct 06 
07:40:11 crc kubenswrapper[4845]: I1006 07:40:11.792544 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0cc2b546-3f23-4c16-af0c-84cce0997fe9/cinder-api/0.log" Oct 06 07:40:11 crc kubenswrapper[4845]: I1006 07:40:11.877211 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0cc2b546-3f23-4c16-af0c-84cce0997fe9/cinder-api-log/0.log" Oct 06 07:40:12 crc kubenswrapper[4845]: I1006 07:40:12.032974 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d9789203-7142-4ed7-b8db-7105d5233557/cinder-scheduler/0.log" Oct 06 07:40:12 crc kubenswrapper[4845]: I1006 07:40:12.085060 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d9789203-7142-4ed7-b8db-7105d5233557/probe/0.log" Oct 06 07:40:12 crc kubenswrapper[4845]: I1006 07:40:12.228324 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9_4e569fb2-9612-43bd-93ab-bfad8fc42c9c/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:40:12 crc kubenswrapper[4845]: I1006 07:40:12.389796 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-h4msk_57c9565a-619a-48cd-af5f-1dc8141f82af/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:40:12 crc kubenswrapper[4845]: I1006 07:40:12.498395 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-tg64c_21b056d4-86a9-4bdc-a052-8cea0b28efac/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:40:12 crc kubenswrapper[4845]: I1006 07:40:12.663764 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55565d6cbc-hlnmz_f37a2a86-a24c-4fa6-9944-ba02bf209e32/init/0.log" Oct 06 07:40:12 crc kubenswrapper[4845]: I1006 07:40:12.887616 4845 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55565d6cbc-hlnmz_f37a2a86-a24c-4fa6-9944-ba02bf209e32/init/0.log" Oct 06 07:40:12 crc kubenswrapper[4845]: I1006 07:40:12.937687 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55565d6cbc-hlnmz_f37a2a86-a24c-4fa6-9944-ba02bf209e32/dnsmasq-dns/0.log" Oct 06 07:40:13 crc kubenswrapper[4845]: I1006 07:40:13.080719 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj_e0ba0e05-c816-48c7-9d88-a735ea82f3eb/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:40:13 crc kubenswrapper[4845]: I1006 07:40:13.158750 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_07e1706a-220a-4291-b2b3-1b79660ec95b/glance-httpd/0.log" Oct 06 07:40:13 crc kubenswrapper[4845]: I1006 07:40:13.261271 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_07e1706a-220a-4291-b2b3-1b79660ec95b/glance-log/0.log" Oct 06 07:40:13 crc kubenswrapper[4845]: I1006 07:40:13.405799 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_19ec12dd-d7b0-45e8-b569-887bbdf5b6fd/glance-httpd/0.log" Oct 06 07:40:13 crc kubenswrapper[4845]: I1006 07:40:13.457286 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_19ec12dd-d7b0-45e8-b569-887bbdf5b6fd/glance-log/0.log" Oct 06 07:40:13 crc kubenswrapper[4845]: I1006 07:40:13.633479 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-gm46p_705e6ad6-f299-43dc-8d30-7c0bd5039250/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:40:13 crc kubenswrapper[4845]: I1006 07:40:13.791172 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-hb7dv_a9079881-df95-4fe0-a6db-2f085d6d974e/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:40:14 crc kubenswrapper[4845]: I1006 07:40:14.017582 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_0dfda52e-f351-49b0-93b6-e95ce8146051/kube-state-metrics/0.log" Oct 06 07:40:14 crc kubenswrapper[4845]: I1006 07:40:14.039029 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-67cb66d46f-6rxvh_6427f38b-494b-4cd7-b019-aa8db716ffe0/keystone-api/0.log" Oct 06 07:40:14 crc kubenswrapper[4845]: I1006 07:40:14.099434 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-kj28v_50be375c-cf6d-4540-930c-e09f602c4045/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:40:14 crc kubenswrapper[4845]: I1006 07:40:14.478075 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-767548595f-nndsw_f554586f-3f7f-4fe0-9a1b-0ff75662c2e2/neutron-api/0.log" Oct 06 07:40:14 crc kubenswrapper[4845]: I1006 07:40:14.581405 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-767548595f-nndsw_f554586f-3f7f-4fe0-9a1b-0ff75662c2e2/neutron-httpd/0.log" Oct 06 07:40:14 crc kubenswrapper[4845]: I1006 07:40:14.679793 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh_60f68944-a123-4f0a-ba3f-8215bf68a123/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:40:15 crc kubenswrapper[4845]: I1006 07:40:15.215285 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_20195d0c-d1c3-476e-86fa-2bc4d2ab39d3/nova-api-log/0.log" Oct 06 07:40:15 crc kubenswrapper[4845]: I1006 07:40:15.324966 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_20195d0c-d1c3-476e-86fa-2bc4d2ab39d3/nova-api-api/0.log" Oct 06 07:40:15 crc kubenswrapper[4845]: I1006 07:40:15.448684 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be/nova-cell0-conductor-conductor/0.log" Oct 06 07:40:15 crc kubenswrapper[4845]: I1006 07:40:15.684414 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_db6ea194-3e38-44a9-9ac4-0182b588cee2/nova-cell1-conductor-conductor/0.log" Oct 06 07:40:15 crc kubenswrapper[4845]: I1006 07:40:15.756622 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_c326f85b-5b04-4ff0-a0e4-29a1e11eefb2/nova-cell1-novncproxy-novncproxy/0.log" Oct 06 07:40:15 crc kubenswrapper[4845]: I1006 07:40:15.990439 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-gftcx_6f89fdcf-abd7-4cf4-aa6f-a05ada603477/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:40:16 crc kubenswrapper[4845]: I1006 07:40:16.150297 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_1fe4c060-2cea-4178-a1ec-33cf60f56ef8/nova-metadata-log/0.log" Oct 06 07:40:16 crc kubenswrapper[4845]: I1006 07:40:16.598998 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_b3d70a5b-fbdb-4d75-bc33-6fef87a933c6/nova-scheduler-scheduler/0.log" Oct 06 07:40:16 crc kubenswrapper[4845]: I1006 07:40:16.811457 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_441b3c5d-0205-472b-8356-e10a4b5b3a4a/mysql-bootstrap/0.log" Oct 06 07:40:17 crc kubenswrapper[4845]: I1006 07:40:17.014874 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_441b3c5d-0205-472b-8356-e10a4b5b3a4a/mysql-bootstrap/0.log" Oct 06 07:40:17 crc kubenswrapper[4845]: 
I1006 07:40:17.045805 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_441b3c5d-0205-472b-8356-e10a4b5b3a4a/galera/0.log" Oct 06 07:40:17 crc kubenswrapper[4845]: I1006 07:40:17.291191 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8d5452f2-c63d-4287-93c4-17b89651a7c1/mysql-bootstrap/0.log" Oct 06 07:40:17 crc kubenswrapper[4845]: I1006 07:40:17.343775 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_1fe4c060-2cea-4178-a1ec-33cf60f56ef8/nova-metadata-metadata/0.log" Oct 06 07:40:17 crc kubenswrapper[4845]: I1006 07:40:17.650699 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8d5452f2-c63d-4287-93c4-17b89651a7c1/mysql-bootstrap/0.log" Oct 06 07:40:17 crc kubenswrapper[4845]: I1006 07:40:17.655972 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8d5452f2-c63d-4287-93c4-17b89651a7c1/galera/0.log" Oct 06 07:40:18 crc kubenswrapper[4845]: I1006 07:40:18.083522 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_ec118969-bd05-449c-bb6b-a460bda1b79a/openstackclient/0.log" Oct 06 07:40:18 crc kubenswrapper[4845]: I1006 07:40:18.115763 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-4qbxx_175127a7-9d27-4976-a4bb-789072f8370c/openstack-network-exporter/0.log" Oct 06 07:40:18 crc kubenswrapper[4845]: I1006 07:40:18.338057 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-n9jwg_1e7e45f8-ca4d-473e-9c7e-12bb2626a080/ovsdb-server-init/0.log" Oct 06 07:40:18 crc kubenswrapper[4845]: I1006 07:40:18.575161 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-n9jwg_1e7e45f8-ca4d-473e-9c7e-12bb2626a080/ovsdb-server-init/0.log" Oct 06 07:40:18 crc kubenswrapper[4845]: I1006 07:40:18.629828 4845 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-n9jwg_1e7e45f8-ca4d-473e-9c7e-12bb2626a080/ovsdb-server/0.log"
Oct 06 07:40:18 crc kubenswrapper[4845]: I1006 07:40:18.678573 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-n9jwg_1e7e45f8-ca4d-473e-9c7e-12bb2626a080/ovs-vswitchd/0.log"
Oct 06 07:40:18 crc kubenswrapper[4845]: I1006 07:40:18.840988 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-v4zd6_e2ee0908-39a9-4303-aad3-040a922d20a7/ovn-controller/0.log"
Oct 06 07:40:19 crc kubenswrapper[4845]: I1006 07:40:19.102799 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fqfdd_c1428fc5-6ae1-4387-9635-69c26981be2a/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 07:40:19 crc kubenswrapper[4845]: I1006 07:40:19.277268 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bf5f7ffb-f69e-40fe-b8b7-157266325c88/openstack-network-exporter/0.log"
Oct 06 07:40:19 crc kubenswrapper[4845]: I1006 07:40:19.415342 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bf5f7ffb-f69e-40fe-b8b7-157266325c88/ovn-northd/0.log"
Oct 06 07:40:19 crc kubenswrapper[4845]: I1006 07:40:19.576504 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3dad007b-9982-4f85-842c-083964cd2734/openstack-network-exporter/0.log"
Oct 06 07:40:19 crc kubenswrapper[4845]: I1006 07:40:19.597524 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3dad007b-9982-4f85-842c-083964cd2734/ovsdbserver-nb/0.log"
Oct 06 07:40:19 crc kubenswrapper[4845]: I1006 07:40:19.871327 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_952ffa29-f400-4b01-a4b7-282a401db753/ovsdbserver-sb/0.log"
Oct 06 07:40:19 crc kubenswrapper[4845]: I1006 07:40:19.872252 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_952ffa29-f400-4b01-a4b7-282a401db753/openstack-network-exporter/0.log"
Oct 06 07:40:20 crc kubenswrapper[4845]: I1006 07:40:20.125672 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8469844cbb-s9qws_0921c6da-fb36-4acf-b978-252f370ccc30/placement-api/0.log"
Oct 06 07:40:20 crc kubenswrapper[4845]: I1006 07:40:20.203434 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8469844cbb-s9qws_0921c6da-fb36-4acf-b978-252f370ccc30/placement-log/0.log"
Oct 06 07:40:20 crc kubenswrapper[4845]: I1006 07:40:20.382457 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_28832f5d-962f-4eef-8903-ab5061b80102/setup-container/0.log"
Oct 06 07:40:20 crc kubenswrapper[4845]: I1006 07:40:20.716905 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_28832f5d-962f-4eef-8903-ab5061b80102/setup-container/0.log"
Oct 06 07:40:20 crc kubenswrapper[4845]: I1006 07:40:20.803816 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_28832f5d-962f-4eef-8903-ab5061b80102/rabbitmq/0.log"
Oct 06 07:40:20 crc kubenswrapper[4845]: I1006 07:40:20.982237 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c295b190-30fe-47c3-ae27-c6b809bbe058/setup-container/0.log"
Oct 06 07:40:21 crc kubenswrapper[4845]: I1006 07:40:21.141946 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c295b190-30fe-47c3-ae27-c6b809bbe058/rabbitmq/0.log"
Oct 06 07:40:21 crc kubenswrapper[4845]: I1006 07:40:21.205235 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c295b190-30fe-47c3-ae27-c6b809bbe058/setup-container/0.log"
Oct 06 07:40:21 crc kubenswrapper[4845]: I1006 07:40:21.333014 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw_bc1542af-e2ed-4aed-b0e9-0854b00c1320/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 07:40:21 crc kubenswrapper[4845]: I1006 07:40:21.480358 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-dvkwd_88581a77-2703-438c-a5d0-e6972d815990/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 07:40:21 crc kubenswrapper[4845]: I1006 07:40:21.706106 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg_4358b521-4e22-42e2-9844-79612bf845b8/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 07:40:21 crc kubenswrapper[4845]: I1006 07:40:21.775990 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bcz6n_88aa0357-70ea-4f4d-80e7-952615d772fe/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 07:40:21 crc kubenswrapper[4845]: I1006 07:40:21.954939 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-dmcpx_c482754d-cc4d-4480-b2c3-1ae079c9222b/ssh-known-hosts-edpm-deployment/0.log"
Oct 06 07:40:22 crc kubenswrapper[4845]: I1006 07:40:22.196601 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-865c95569-jxblm_f3a28a2d-4deb-408d-b47b-600758782cdf/proxy-server/0.log"
Oct 06 07:40:22 crc kubenswrapper[4845]: I1006 07:40:22.252762 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-865c95569-jxblm_f3a28a2d-4deb-408d-b47b-600758782cdf/proxy-httpd/0.log"
Oct 06 07:40:22 crc kubenswrapper[4845]: I1006 07:40:22.451872 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-86cb9_955c6fb6-c2a5-48cd-8680-632f32157e5c/swift-ring-rebalance/0.log"
Oct 06 07:40:22 crc kubenswrapper[4845]: I1006 07:40:22.503041 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/account-auditor/0.log"
Oct 06 07:40:22 crc kubenswrapper[4845]: I1006 07:40:22.699422 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/account-server/0.log"
Oct 06 07:40:22 crc kubenswrapper[4845]: I1006 07:40:22.704041 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/account-reaper/0.log"
Oct 06 07:40:22 crc kubenswrapper[4845]: I1006 07:40:22.704230 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/account-replicator/0.log"
Oct 06 07:40:22 crc kubenswrapper[4845]: I1006 07:40:22.893842 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/container-auditor/0.log"
Oct 06 07:40:22 crc kubenswrapper[4845]: I1006 07:40:22.941069 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/container-server/0.log"
Oct 06 07:40:23 crc kubenswrapper[4845]: I1006 07:40:23.002549 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/container-replicator/0.log"
Oct 06 07:40:23 crc kubenswrapper[4845]: I1006 07:40:23.111210 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/container-updater/0.log"
Oct 06 07:40:23 crc kubenswrapper[4845]: I1006 07:40:23.197619 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/object-auditor/0.log"
Oct 06 07:40:23 crc kubenswrapper[4845]: I1006 07:40:23.245760 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/object-expirer/0.log"
Oct 06 07:40:23 crc kubenswrapper[4845]: I1006 07:40:23.338451 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/object-replicator/0.log"
Oct 06 07:40:23 crc kubenswrapper[4845]: I1006 07:40:23.397608 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/object-server/0.log"
Oct 06 07:40:23 crc kubenswrapper[4845]: I1006 07:40:23.413287 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/object-updater/0.log"
Oct 06 07:40:23 crc kubenswrapper[4845]: I1006 07:40:23.532068 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/rsync/0.log"
Oct 06 07:40:23 crc kubenswrapper[4845]: I1006 07:40:23.616620 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/swift-recon-cron/0.log"
Oct 06 07:40:23 crc kubenswrapper[4845]: I1006 07:40:23.922712 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9_81f752a1-1104-4b71-9a89-1c3961584f6f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 07:40:24 crc kubenswrapper[4845]: I1006 07:40:24.043583 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_1fdf5d3a-7d9e-4702-ae15-1373bbd94574/tempest-tests-tempest-tests-runner/0.log"
Oct 06 07:40:24 crc kubenswrapper[4845]: I1006 07:40:24.254098 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_a918bcf8-062d-4a05-8f1b-bc24088f12b7/test-operator-logs-container/0.log"
Oct 06 07:40:24 crc kubenswrapper[4845]: I1006 07:40:24.451850 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-qchpk_f9e93bbd-d62e-451f-bea9-c6a926c912a6/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 06 07:40:28 crc kubenswrapper[4845]: I1006 07:40:28.421074 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_08494b3a-f49b-49af-8e06-df5f4fac3171/memcached/0.log"
Oct 06 07:40:53 crc kubenswrapper[4845]: I1006 07:40:53.018538 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 07:40:53 crc kubenswrapper[4845]: I1006 07:40:53.019012 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 07:41:13 crc kubenswrapper[4845]: I1006 07:41:13.767945 4845 generic.go:334] "Generic (PLEG): container finished" podID="5ad4472d-e587-49e6-9a07-d678cf0e904b" containerID="3dc7981dec70b344ce8c0ed5c0775b7e8291db74528f1e658128f8ed00769817" exitCode=0
Oct 06 07:41:13 crc kubenswrapper[4845]: I1006 07:41:13.768041 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xcvm4/crc-debug-qlzm9" event={"ID":"5ad4472d-e587-49e6-9a07-d678cf0e904b","Type":"ContainerDied","Data":"3dc7981dec70b344ce8c0ed5c0775b7e8291db74528f1e658128f8ed00769817"}
Oct 06 07:41:14 crc kubenswrapper[4845]: I1006 07:41:14.892097 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xcvm4/crc-debug-qlzm9"
Oct 06 07:41:14 crc kubenswrapper[4845]: I1006 07:41:14.928095 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xcvm4/crc-debug-qlzm9"]
Oct 06 07:41:14 crc kubenswrapper[4845]: I1006 07:41:14.936122 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xcvm4/crc-debug-qlzm9"]
Oct 06 07:41:14 crc kubenswrapper[4845]: I1006 07:41:14.941194 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjqth\" (UniqueName: \"kubernetes.io/projected/5ad4472d-e587-49e6-9a07-d678cf0e904b-kube-api-access-sjqth\") pod \"5ad4472d-e587-49e6-9a07-d678cf0e904b\" (UID: \"5ad4472d-e587-49e6-9a07-d678cf0e904b\") "
Oct 06 07:41:14 crc kubenswrapper[4845]: I1006 07:41:14.941226 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ad4472d-e587-49e6-9a07-d678cf0e904b-host\") pod \"5ad4472d-e587-49e6-9a07-d678cf0e904b\" (UID: \"5ad4472d-e587-49e6-9a07-d678cf0e904b\") "
Oct 06 07:41:14 crc kubenswrapper[4845]: I1006 07:41:14.941439 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ad4472d-e587-49e6-9a07-d678cf0e904b-host" (OuterVolumeSpecName: "host") pod "5ad4472d-e587-49e6-9a07-d678cf0e904b" (UID: "5ad4472d-e587-49e6-9a07-d678cf0e904b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 06 07:41:14 crc kubenswrapper[4845]: I1006 07:41:14.941796 4845 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ad4472d-e587-49e6-9a07-d678cf0e904b-host\") on node \"crc\" DevicePath \"\""
Oct 06 07:41:14 crc kubenswrapper[4845]: I1006 07:41:14.946272 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ad4472d-e587-49e6-9a07-d678cf0e904b-kube-api-access-sjqth" (OuterVolumeSpecName: "kube-api-access-sjqth") pod "5ad4472d-e587-49e6-9a07-d678cf0e904b" (UID: "5ad4472d-e587-49e6-9a07-d678cf0e904b"). InnerVolumeSpecName "kube-api-access-sjqth". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:41:15 crc kubenswrapper[4845]: I1006 07:41:15.043958 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjqth\" (UniqueName: \"kubernetes.io/projected/5ad4472d-e587-49e6-9a07-d678cf0e904b-kube-api-access-sjqth\") on node \"crc\" DevicePath \"\""
Oct 06 07:41:15 crc kubenswrapper[4845]: I1006 07:41:15.788364 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5056ed37737be500fdc96d998c7b5f9967ee1018f6268e275f8113e82386adeb"
Oct 06 07:41:15 crc kubenswrapper[4845]: I1006 07:41:15.788468 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xcvm4/crc-debug-qlzm9"
Oct 06 07:41:16 crc kubenswrapper[4845]: I1006 07:41:16.129657 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xcvm4/crc-debug-cx2s7"]
Oct 06 07:41:16 crc kubenswrapper[4845]: E1006 07:41:16.130085 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ad4472d-e587-49e6-9a07-d678cf0e904b" containerName="container-00"
Oct 06 07:41:16 crc kubenswrapper[4845]: I1006 07:41:16.130097 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ad4472d-e587-49e6-9a07-d678cf0e904b" containerName="container-00"
Oct 06 07:41:16 crc kubenswrapper[4845]: I1006 07:41:16.130434 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ad4472d-e587-49e6-9a07-d678cf0e904b" containerName="container-00"
Oct 06 07:41:16 crc kubenswrapper[4845]: I1006 07:41:16.131243 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xcvm4/crc-debug-cx2s7"
Oct 06 07:41:16 crc kubenswrapper[4845]: I1006 07:41:16.260840 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ad4472d-e587-49e6-9a07-d678cf0e904b" path="/var/lib/kubelet/pods/5ad4472d-e587-49e6-9a07-d678cf0e904b/volumes"
Oct 06 07:41:16 crc kubenswrapper[4845]: I1006 07:41:16.267978 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2eed0200-c615-4229-9111-17b49ef48812-host\") pod \"crc-debug-cx2s7\" (UID: \"2eed0200-c615-4229-9111-17b49ef48812\") " pod="openshift-must-gather-xcvm4/crc-debug-cx2s7"
Oct 06 07:41:16 crc kubenswrapper[4845]: I1006 07:41:16.268076 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c6qk\" (UniqueName: \"kubernetes.io/projected/2eed0200-c615-4229-9111-17b49ef48812-kube-api-access-5c6qk\") pod \"crc-debug-cx2s7\" (UID: \"2eed0200-c615-4229-9111-17b49ef48812\") " pod="openshift-must-gather-xcvm4/crc-debug-cx2s7"
Oct 06 07:41:16 crc kubenswrapper[4845]: I1006 07:41:16.370212 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2eed0200-c615-4229-9111-17b49ef48812-host\") pod \"crc-debug-cx2s7\" (UID: \"2eed0200-c615-4229-9111-17b49ef48812\") " pod="openshift-must-gather-xcvm4/crc-debug-cx2s7"
Oct 06 07:41:16 crc kubenswrapper[4845]: I1006 07:41:16.370317 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c6qk\" (UniqueName: \"kubernetes.io/projected/2eed0200-c615-4229-9111-17b49ef48812-kube-api-access-5c6qk\") pod \"crc-debug-cx2s7\" (UID: \"2eed0200-c615-4229-9111-17b49ef48812\") " pod="openshift-must-gather-xcvm4/crc-debug-cx2s7"
Oct 06 07:41:16 crc kubenswrapper[4845]: I1006 07:41:16.370361 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2eed0200-c615-4229-9111-17b49ef48812-host\") pod \"crc-debug-cx2s7\" (UID: \"2eed0200-c615-4229-9111-17b49ef48812\") " pod="openshift-must-gather-xcvm4/crc-debug-cx2s7"
Oct 06 07:41:16 crc kubenswrapper[4845]: I1006 07:41:16.391529 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c6qk\" (UniqueName: \"kubernetes.io/projected/2eed0200-c615-4229-9111-17b49ef48812-kube-api-access-5c6qk\") pod \"crc-debug-cx2s7\" (UID: \"2eed0200-c615-4229-9111-17b49ef48812\") " pod="openshift-must-gather-xcvm4/crc-debug-cx2s7"
Oct 06 07:41:16 crc kubenswrapper[4845]: I1006 07:41:16.449972 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xcvm4/crc-debug-cx2s7"
Oct 06 07:41:16 crc kubenswrapper[4845]: W1006 07:41:16.478395 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2eed0200_c615_4229_9111_17b49ef48812.slice/crio-a17b6072bdd9fa8f225e42d1c315a8e16d71402b2a425d56c9a33621ac5bb62c WatchSource:0}: Error finding container a17b6072bdd9fa8f225e42d1c315a8e16d71402b2a425d56c9a33621ac5bb62c: Status 404 returned error can't find the container with id a17b6072bdd9fa8f225e42d1c315a8e16d71402b2a425d56c9a33621ac5bb62c
Oct 06 07:41:16 crc kubenswrapper[4845]: I1006 07:41:16.799891 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xcvm4/crc-debug-cx2s7" event={"ID":"2eed0200-c615-4229-9111-17b49ef48812","Type":"ContainerStarted","Data":"22244d55004bbb4d5a058c2a963d112cd372c9d5b213a9b38f5775ece70666e8"}
Oct 06 07:41:16 crc kubenswrapper[4845]: I1006 07:41:16.800320 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xcvm4/crc-debug-cx2s7" event={"ID":"2eed0200-c615-4229-9111-17b49ef48812","Type":"ContainerStarted","Data":"a17b6072bdd9fa8f225e42d1c315a8e16d71402b2a425d56c9a33621ac5bb62c"}
Oct 06 07:41:16 crc kubenswrapper[4845]: I1006 07:41:16.818145 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xcvm4/crc-debug-cx2s7" podStartSLOduration=0.818126701 podStartE2EDuration="818.126701ms" podCreationTimestamp="2025-10-06 07:41:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:41:16.811025262 +0000 UTC m=+3361.325766290" watchObservedRunningTime="2025-10-06 07:41:16.818126701 +0000 UTC m=+3361.332867709"
Oct 06 07:41:17 crc kubenswrapper[4845]: I1006 07:41:17.812504 4845 generic.go:334] "Generic (PLEG): container finished" podID="2eed0200-c615-4229-9111-17b49ef48812" containerID="22244d55004bbb4d5a058c2a963d112cd372c9d5b213a9b38f5775ece70666e8" exitCode=0
Oct 06 07:41:17 crc kubenswrapper[4845]: I1006 07:41:17.812584 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xcvm4/crc-debug-cx2s7" event={"ID":"2eed0200-c615-4229-9111-17b49ef48812","Type":"ContainerDied","Data":"22244d55004bbb4d5a058c2a963d112cd372c9d5b213a9b38f5775ece70666e8"}
Oct 06 07:41:18 crc kubenswrapper[4845]: I1006 07:41:18.916975 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xcvm4/crc-debug-cx2s7"
Oct 06 07:41:19 crc kubenswrapper[4845]: I1006 07:41:19.014042 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2eed0200-c615-4229-9111-17b49ef48812-host\") pod \"2eed0200-c615-4229-9111-17b49ef48812\" (UID: \"2eed0200-c615-4229-9111-17b49ef48812\") "
Oct 06 07:41:19 crc kubenswrapper[4845]: I1006 07:41:19.014126 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c6qk\" (UniqueName: \"kubernetes.io/projected/2eed0200-c615-4229-9111-17b49ef48812-kube-api-access-5c6qk\") pod \"2eed0200-c615-4229-9111-17b49ef48812\" (UID: \"2eed0200-c615-4229-9111-17b49ef48812\") "
Oct 06 07:41:19 crc kubenswrapper[4845]: I1006 07:41:19.014723 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2eed0200-c615-4229-9111-17b49ef48812-host" (OuterVolumeSpecName: "host") pod "2eed0200-c615-4229-9111-17b49ef48812" (UID: "2eed0200-c615-4229-9111-17b49ef48812"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 06 07:41:19 crc kubenswrapper[4845]: I1006 07:41:19.022747 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eed0200-c615-4229-9111-17b49ef48812-kube-api-access-5c6qk" (OuterVolumeSpecName: "kube-api-access-5c6qk") pod "2eed0200-c615-4229-9111-17b49ef48812" (UID: "2eed0200-c615-4229-9111-17b49ef48812"). InnerVolumeSpecName "kube-api-access-5c6qk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:41:19 crc kubenswrapper[4845]: I1006 07:41:19.116092 4845 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2eed0200-c615-4229-9111-17b49ef48812-host\") on node \"crc\" DevicePath \"\""
Oct 06 07:41:19 crc kubenswrapper[4845]: I1006 07:41:19.116118 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c6qk\" (UniqueName: \"kubernetes.io/projected/2eed0200-c615-4229-9111-17b49ef48812-kube-api-access-5c6qk\") on node \"crc\" DevicePath \"\""
Oct 06 07:41:19 crc kubenswrapper[4845]: I1006 07:41:19.829583 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xcvm4/crc-debug-cx2s7" event={"ID":"2eed0200-c615-4229-9111-17b49ef48812","Type":"ContainerDied","Data":"a17b6072bdd9fa8f225e42d1c315a8e16d71402b2a425d56c9a33621ac5bb62c"}
Oct 06 07:41:19 crc kubenswrapper[4845]: I1006 07:41:19.829954 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a17b6072bdd9fa8f225e42d1c315a8e16d71402b2a425d56c9a33621ac5bb62c"
Oct 06 07:41:19 crc kubenswrapper[4845]: I1006 07:41:19.829636 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xcvm4/crc-debug-cx2s7"
Oct 06 07:41:23 crc kubenswrapper[4845]: I1006 07:41:23.018425 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 07:41:23 crc kubenswrapper[4845]: I1006 07:41:23.019004 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 07:41:23 crc kubenswrapper[4845]: I1006 07:41:23.346074 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xcvm4/crc-debug-cx2s7"]
Oct 06 07:41:23 crc kubenswrapper[4845]: I1006 07:41:23.354588 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xcvm4/crc-debug-cx2s7"]
Oct 06 07:41:24 crc kubenswrapper[4845]: I1006 07:41:24.241281 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eed0200-c615-4229-9111-17b49ef48812" path="/var/lib/kubelet/pods/2eed0200-c615-4229-9111-17b49ef48812/volumes"
Oct 06 07:41:24 crc kubenswrapper[4845]: I1006 07:41:24.501982 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xcvm4/crc-debug-thvgb"]
Oct 06 07:41:24 crc kubenswrapper[4845]: E1006 07:41:24.502413 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eed0200-c615-4229-9111-17b49ef48812" containerName="container-00"
Oct 06 07:41:24 crc kubenswrapper[4845]: I1006 07:41:24.502433 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eed0200-c615-4229-9111-17b49ef48812" containerName="container-00"
Oct 06 07:41:24 crc kubenswrapper[4845]: I1006 07:41:24.502627 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eed0200-c615-4229-9111-17b49ef48812" containerName="container-00"
Oct 06 07:41:24 crc kubenswrapper[4845]: I1006 07:41:24.503220 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xcvm4/crc-debug-thvgb"
Oct 06 07:41:24 crc kubenswrapper[4845]: I1006 07:41:24.604149 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qx54\" (UniqueName: \"kubernetes.io/projected/9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b-kube-api-access-5qx54\") pod \"crc-debug-thvgb\" (UID: \"9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b\") " pod="openshift-must-gather-xcvm4/crc-debug-thvgb"
Oct 06 07:41:24 crc kubenswrapper[4845]: I1006 07:41:24.604248 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b-host\") pod \"crc-debug-thvgb\" (UID: \"9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b\") " pod="openshift-must-gather-xcvm4/crc-debug-thvgb"
Oct 06 07:41:24 crc kubenswrapper[4845]: I1006 07:41:24.706360 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qx54\" (UniqueName: \"kubernetes.io/projected/9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b-kube-api-access-5qx54\") pod \"crc-debug-thvgb\" (UID: \"9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b\") " pod="openshift-must-gather-xcvm4/crc-debug-thvgb"
Oct 06 07:41:24 crc kubenswrapper[4845]: I1006 07:41:24.706638 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b-host\") pod \"crc-debug-thvgb\" (UID: \"9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b\") " pod="openshift-must-gather-xcvm4/crc-debug-thvgb"
Oct 06 07:41:24 crc kubenswrapper[4845]: I1006 07:41:24.706747 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b-host\") pod \"crc-debug-thvgb\" (UID: \"9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b\") " pod="openshift-must-gather-xcvm4/crc-debug-thvgb"
Oct 06 07:41:24 crc kubenswrapper[4845]: I1006 07:41:24.724784 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qx54\" (UniqueName: \"kubernetes.io/projected/9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b-kube-api-access-5qx54\") pod \"crc-debug-thvgb\" (UID: \"9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b\") " pod="openshift-must-gather-xcvm4/crc-debug-thvgb"
Oct 06 07:41:24 crc kubenswrapper[4845]: I1006 07:41:24.831181 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xcvm4/crc-debug-thvgb"
Oct 06 07:41:24 crc kubenswrapper[4845]: I1006 07:41:24.883313 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xcvm4/crc-debug-thvgb" event={"ID":"9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b","Type":"ContainerStarted","Data":"898d0d2c304534b4daf86b2910a9a6d6b40c8847aac51c0c82187dd81e15f3ab"}
Oct 06 07:41:25 crc kubenswrapper[4845]: I1006 07:41:25.895993 4845 generic.go:334] "Generic (PLEG): container finished" podID="9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b" containerID="e626342a452f65f8711fe2b1f8994bd6a9fbbe448dec377c08fabf7ee1f1d2b2" exitCode=0
Oct 06 07:41:25 crc kubenswrapper[4845]: I1006 07:41:25.896086 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xcvm4/crc-debug-thvgb" event={"ID":"9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b","Type":"ContainerDied","Data":"e626342a452f65f8711fe2b1f8994bd6a9fbbe448dec377c08fabf7ee1f1d2b2"}
Oct 06 07:41:25 crc kubenswrapper[4845]: I1006 07:41:25.938442 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xcvm4/crc-debug-thvgb"]
Oct 06 07:41:25 crc kubenswrapper[4845]: I1006 07:41:25.945884 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xcvm4/crc-debug-thvgb"]
Oct 06 07:41:27 crc kubenswrapper[4845]: I1006 07:41:27.027074 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xcvm4/crc-debug-thvgb"
Oct 06 07:41:27 crc kubenswrapper[4845]: I1006 07:41:27.151462 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qx54\" (UniqueName: \"kubernetes.io/projected/9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b-kube-api-access-5qx54\") pod \"9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b\" (UID: \"9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b\") "
Oct 06 07:41:27 crc kubenswrapper[4845]: I1006 07:41:27.151644 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b-host\") pod \"9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b\" (UID: \"9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b\") "
Oct 06 07:41:27 crc kubenswrapper[4845]: I1006 07:41:27.151886 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b-host" (OuterVolumeSpecName: "host") pod "9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b" (UID: "9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 06 07:41:27 crc kubenswrapper[4845]: I1006 07:41:27.152204 4845 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b-host\") on node \"crc\" DevicePath \"\""
Oct 06 07:41:27 crc kubenswrapper[4845]: I1006 07:41:27.157635 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b-kube-api-access-5qx54" (OuterVolumeSpecName: "kube-api-access-5qx54") pod "9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b" (UID: "9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b"). InnerVolumeSpecName "kube-api-access-5qx54". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:41:27 crc kubenswrapper[4845]: I1006 07:41:27.253562 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qx54\" (UniqueName: \"kubernetes.io/projected/9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b-kube-api-access-5qx54\") on node \"crc\" DevicePath \"\""
Oct 06 07:41:27 crc kubenswrapper[4845]: I1006 07:41:27.427129 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl_609d88e3-66d9-4f44-a539-2b6c35886f06/util/0.log"
Oct 06 07:41:27 crc kubenswrapper[4845]: I1006 07:41:27.625242 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl_609d88e3-66d9-4f44-a539-2b6c35886f06/pull/0.log"
Oct 06 07:41:27 crc kubenswrapper[4845]: I1006 07:41:27.652885 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl_609d88e3-66d9-4f44-a539-2b6c35886f06/util/0.log"
Oct 06 07:41:27 crc kubenswrapper[4845]: I1006 07:41:27.654073 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl_609d88e3-66d9-4f44-a539-2b6c35886f06/pull/0.log"
Oct 06 07:41:27 crc kubenswrapper[4845]: I1006 07:41:27.809623 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl_609d88e3-66d9-4f44-a539-2b6c35886f06/extract/0.log"
Oct 06 07:41:27 crc kubenswrapper[4845]: I1006 07:41:27.814604 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl_609d88e3-66d9-4f44-a539-2b6c35886f06/util/0.log"
Oct 06 07:41:27 crc kubenswrapper[4845]: I1006 07:41:27.815186 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl_609d88e3-66d9-4f44-a539-2b6c35886f06/pull/0.log"
Oct 06 07:41:27 crc kubenswrapper[4845]: I1006 07:41:27.915339 4845 scope.go:117] "RemoveContainer" containerID="e626342a452f65f8711fe2b1f8994bd6a9fbbe448dec377c08fabf7ee1f1d2b2"
Oct 06 07:41:27 crc kubenswrapper[4845]: I1006 07:41:27.915393 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xcvm4/crc-debug-thvgb"
Oct 06 07:41:28 crc kubenswrapper[4845]: I1006 07:41:28.004758 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5b974f6766-qzbsb_2e4d8467-600b-4ae3-8b1a-c4f0416718f7/kube-rbac-proxy/0.log"
Oct 06 07:41:28 crc kubenswrapper[4845]: I1006 07:41:28.090636 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5b974f6766-qzbsb_2e4d8467-600b-4ae3-8b1a-c4f0416718f7/manager/0.log"
Oct 06 07:41:28 crc kubenswrapper[4845]: I1006 07:41:28.109428 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-84bd8f6848-mvrqc_4629df36-951c-461d-9b54-c69cbec8bcd5/kube-rbac-proxy/0.log"
Oct 06 07:41:28 crc kubenswrapper[4845]: I1006 07:41:28.212531 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-84bd8f6848-mvrqc_4629df36-951c-461d-9b54-c69cbec8bcd5/manager/0.log"
Oct 06 07:41:28 crc kubenswrapper[4845]: I1006 07:41:28.249761 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b" path="/var/lib/kubelet/pods/9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b/volumes"
Oct 06 07:41:28 crc kubenswrapper[4845]: I1006 07:41:28.265722 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58d86cd59d-hf7mm_0a59590c-5261-403e-a7e3-0e726a025412/kube-rbac-proxy/0.log"
Oct 06 07:41:28 crc kubenswrapper[4845]: I1006 07:41:28.303715 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58d86cd59d-hf7mm_0a59590c-5261-403e-a7e3-0e726a025412/manager/0.log"
Oct 06 07:41:28 crc kubenswrapper[4845]: I1006 07:41:28.424187 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-698456cdc6-kdbk5_540c7899-a612-4745-8c42-02033d088f73/kube-rbac-proxy/0.log"
Oct 06 07:41:28 crc kubenswrapper[4845]: I1006 07:41:28.500066 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-698456cdc6-kdbk5_540c7899-a612-4745-8c42-02033d088f73/manager/0.log"
Oct 06 07:41:28 crc kubenswrapper[4845]: I1006 07:41:28.621154 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5c497dbdb-nw58b_5166cd91-6a8c-4b81-b311-2a1e561928d3/kube-rbac-proxy/0.log"
Oct 06 07:41:28 crc kubenswrapper[4845]: I1006 07:41:28.645101 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5c497dbdb-nw58b_5166cd91-6a8c-4b81-b311-2a1e561928d3/manager/0.log"
Oct 06 07:41:28 crc kubenswrapper[4845]: I1006 07:41:28.696258 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6675647785-4xx5n_c3eb4ea2-4738-4a6e-9eed-f41f7a616cdd/kube-rbac-proxy/0.log"
Oct 06 07:41:28 crc kubenswrapper[4845]: I1006 07:41:28.820085 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6675647785-4xx5n_c3eb4ea2-4738-4a6e-9eed-f41f7a616cdd/manager/0.log"
Oct 06 07:41:28 crc kubenswrapper[4845]: I1006 07:41:28.859610 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84788b6bc5-s8lqw_3f91fed8-7759-443b-869a-886f63b42502/kube-rbac-proxy/0.log"
Oct 06 07:41:29 crc kubenswrapper[4845]: I1006 07:41:29.034442 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84788b6bc5-s8lqw_3f91fed8-7759-443b-869a-886f63b42502/manager/0.log"
Oct 06 07:41:29 crc kubenswrapper[4845]: I1006 07:41:29.035658 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f5894c49f-lx8hg_01476e52-bab1-4b3b-b2d9-3a9d9469c943/kube-rbac-proxy/0.log"
Oct 06 07:41:29 crc kubenswrapper[4845]: I1006 07:41:29.096389 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f5894c49f-lx8hg_01476e52-bab1-4b3b-b2d9-3a9d9469c943/manager/0.log"
Oct 06 07:41:29 crc kubenswrapper[4845]: I1006 07:41:29.236196 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-57c9cdcf57-tf759_09e0aaab-0038-4f9b-881e-3774781f2825/kube-rbac-proxy/0.log"
Oct 06 07:41:29 crc kubenswrapper[4845]: I1006 07:41:29.281024 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-57c9cdcf57-tf759_09e0aaab-0038-4f9b-881e-3774781f2825/manager/0.log"
Oct 06 07:41:29 crc kubenswrapper[4845]: I1006 07:41:29.384636 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7cb48dbc-6vhjt_12269390-1665-4934-8695-eab596535e81/kube-rbac-proxy/0.log"
Oct 06 07:41:29 crc kubenswrapper[4845]: I1006 07:41:29.419629 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7cb48dbc-6vhjt_12269390-1665-4934-8695-eab596535e81/manager/0.log"
Oct 06 07:41:29 crc kubenswrapper[4845]: I1006 07:41:29.485300 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-d6c9dc5bc-wghng_223d7355-4741-4d59-b6cf-e71702ddc20e/kube-rbac-proxy/0.log"
Oct 06 07:41:29 crc kubenswrapper[4845]: I1006 07:41:29.668048 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-d6c9dc5bc-wghng_223d7355-4741-4d59-b6cf-e71702ddc20e/manager/0.log"
Oct 06 07:41:29 crc
kubenswrapper[4845]: I1006 07:41:29.722337 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-69b956fbf6-c4htj_49f6ba5b-4750-418f-ac64-574d92bf6f61/kube-rbac-proxy/0.log" Oct 06 07:41:29 crc kubenswrapper[4845]: I1006 07:41:29.829834 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-69b956fbf6-c4htj_49f6ba5b-4750-418f-ac64-574d92bf6f61/manager/0.log" Oct 06 07:41:30 crc kubenswrapper[4845]: I1006 07:41:30.071762 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6c9b57c67-wrcqz_93be683f-25a1-477e-b676-5bc7be2c3bf8/kube-rbac-proxy/0.log" Oct 06 07:41:30 crc kubenswrapper[4845]: I1006 07:41:30.092677 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6c9b57c67-wrcqz_93be683f-25a1-477e-b676-5bc7be2c3bf8/manager/0.log" Oct 06 07:41:30 crc kubenswrapper[4845]: I1006 07:41:30.268472 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f59f9d8-t66dv_58567cef-7ff1-455c-a1d3-1f7a6f35a504/kube-rbac-proxy/0.log" Oct 06 07:41:30 crc kubenswrapper[4845]: I1006 07:41:30.359579 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f59f9d8-t66dv_58567cef-7ff1-455c-a1d3-1f7a6f35a504/manager/0.log" Oct 06 07:41:30 crc kubenswrapper[4845]: I1006 07:41:30.420567 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-66cc85b5d582qhb_1877f632-ca26-4045-a192-08d2f0f97a4e/kube-rbac-proxy/0.log" Oct 06 07:41:30 crc kubenswrapper[4845]: I1006 07:41:30.457694 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-66cc85b5d582qhb_1877f632-ca26-4045-a192-08d2f0f97a4e/manager/0.log" Oct 06 07:41:30 crc kubenswrapper[4845]: I1006 07:41:30.568846 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7cfc658b9-dlvb8_05de66df-c97e-40d4-a605-188b3d8e66eb/kube-rbac-proxy/0.log" Oct 06 07:41:30 crc kubenswrapper[4845]: I1006 07:41:30.815123 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-677d5bb784-zdlwp_f8c7a5a4-11d3-4d38-95d5-fb90e97378b9/kube-rbac-proxy/0.log" Oct 06 07:41:30 crc kubenswrapper[4845]: I1006 07:41:30.968891 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-677d5bb784-zdlwp_f8c7a5a4-11d3-4d38-95d5-fb90e97378b9/operator/0.log" Oct 06 07:41:31 crc kubenswrapper[4845]: I1006 07:41:31.064650 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-k8dzl_6cbeaf69-7c1e-472a-8383-209ac778658e/registry-server/0.log" Oct 06 07:41:31 crc kubenswrapper[4845]: I1006 07:41:31.108580 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-c968bb45-h4m5g_8342c3c4-6f77-4647-a4c2-9f834c55ee19/kube-rbac-proxy/0.log" Oct 06 07:41:31 crc kubenswrapper[4845]: I1006 07:41:31.263128 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-c968bb45-h4m5g_8342c3c4-6f77-4647-a4c2-9f834c55ee19/manager/0.log" Oct 06 07:41:31 crc kubenswrapper[4845]: I1006 07:41:31.353608 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-66f6d6849b-5qmws_c21520ff-b41e-4433-8656-d248f0975c60/kube-rbac-proxy/0.log" Oct 06 07:41:31 crc kubenswrapper[4845]: I1006 07:41:31.376829 
4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-66f6d6849b-5qmws_c21520ff-b41e-4433-8656-d248f0975c60/manager/0.log" Oct 06 07:41:31 crc kubenswrapper[4845]: I1006 07:41:31.539287 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw_880d791a-bcb7-4f71-8a16-015bd26af4d9/operator/0.log" Oct 06 07:41:31 crc kubenswrapper[4845]: I1006 07:41:31.621154 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-76d5577b-9v85j_38e8455f-f063-46aa-8275-20b6d80aa9ea/kube-rbac-proxy/0.log" Oct 06 07:41:31 crc kubenswrapper[4845]: I1006 07:41:31.676393 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7cfc658b9-dlvb8_05de66df-c97e-40d4-a605-188b3d8e66eb/manager/0.log" Oct 06 07:41:31 crc kubenswrapper[4845]: I1006 07:41:31.732318 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-76d5577b-9v85j_38e8455f-f063-46aa-8275-20b6d80aa9ea/manager/0.log" Oct 06 07:41:31 crc kubenswrapper[4845]: I1006 07:41:31.814457 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-f589c7597-c2mlw_e3721fe9-cb88-4326-aa90-2d08b909515e/kube-rbac-proxy/0.log" Oct 06 07:41:31 crc kubenswrapper[4845]: I1006 07:41:31.861778 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-f589c7597-c2mlw_e3721fe9-cb88-4326-aa90-2d08b909515e/manager/0.log" Oct 06 07:41:31 crc kubenswrapper[4845]: I1006 07:41:31.945169 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb6dcddc-nsmkl_c921a4a9-e09a-4fd2-965e-c13f7fee169e/kube-rbac-proxy/0.log" Oct 06 07:41:31 crc 
kubenswrapper[4845]: I1006 07:41:31.982366 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb6dcddc-nsmkl_c921a4a9-e09a-4fd2-965e-c13f7fee169e/manager/0.log" Oct 06 07:41:32 crc kubenswrapper[4845]: I1006 07:41:32.045061 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5d98cc5575-mj4wz_dcb43cda-fbc6-4092-bf5d-296858e233cd/kube-rbac-proxy/0.log" Oct 06 07:41:32 crc kubenswrapper[4845]: I1006 07:41:32.046223 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5d98cc5575-mj4wz_dcb43cda-fbc6-4092-bf5d-296858e233cd/manager/0.log" Oct 06 07:41:39 crc kubenswrapper[4845]: I1006 07:41:39.570517 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2smdl"] Oct 06 07:41:39 crc kubenswrapper[4845]: E1006 07:41:39.572838 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b" containerName="container-00" Oct 06 07:41:39 crc kubenswrapper[4845]: I1006 07:41:39.572880 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b" containerName="container-00" Oct 06 07:41:39 crc kubenswrapper[4845]: I1006 07:41:39.573114 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d4ff932-01d7-4a9a-95c1-cf3d8c57dd3b" containerName="container-00" Oct 06 07:41:39 crc kubenswrapper[4845]: I1006 07:41:39.574924 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2smdl" Oct 06 07:41:39 crc kubenswrapper[4845]: I1006 07:41:39.582342 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2smdl"] Oct 06 07:41:39 crc kubenswrapper[4845]: I1006 07:41:39.669063 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnwm4\" (UniqueName: \"kubernetes.io/projected/fada0329-244a-449f-bb74-2ee74eaba095-kube-api-access-pnwm4\") pod \"community-operators-2smdl\" (UID: \"fada0329-244a-449f-bb74-2ee74eaba095\") " pod="openshift-marketplace/community-operators-2smdl" Oct 06 07:41:39 crc kubenswrapper[4845]: I1006 07:41:39.669118 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fada0329-244a-449f-bb74-2ee74eaba095-utilities\") pod \"community-operators-2smdl\" (UID: \"fada0329-244a-449f-bb74-2ee74eaba095\") " pod="openshift-marketplace/community-operators-2smdl" Oct 06 07:41:39 crc kubenswrapper[4845]: I1006 07:41:39.669253 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fada0329-244a-449f-bb74-2ee74eaba095-catalog-content\") pod \"community-operators-2smdl\" (UID: \"fada0329-244a-449f-bb74-2ee74eaba095\") " pod="openshift-marketplace/community-operators-2smdl" Oct 06 07:41:39 crc kubenswrapper[4845]: I1006 07:41:39.771307 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fada0329-244a-449f-bb74-2ee74eaba095-utilities\") pod \"community-operators-2smdl\" (UID: \"fada0329-244a-449f-bb74-2ee74eaba095\") " pod="openshift-marketplace/community-operators-2smdl" Oct 06 07:41:39 crc kubenswrapper[4845]: I1006 07:41:39.771672 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pnwm4\" (UniqueName: \"kubernetes.io/projected/fada0329-244a-449f-bb74-2ee74eaba095-kube-api-access-pnwm4\") pod \"community-operators-2smdl\" (UID: \"fada0329-244a-449f-bb74-2ee74eaba095\") " pod="openshift-marketplace/community-operators-2smdl" Oct 06 07:41:39 crc kubenswrapper[4845]: I1006 07:41:39.771756 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fada0329-244a-449f-bb74-2ee74eaba095-catalog-content\") pod \"community-operators-2smdl\" (UID: \"fada0329-244a-449f-bb74-2ee74eaba095\") " pod="openshift-marketplace/community-operators-2smdl" Oct 06 07:41:39 crc kubenswrapper[4845]: I1006 07:41:39.771809 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fada0329-244a-449f-bb74-2ee74eaba095-utilities\") pod \"community-operators-2smdl\" (UID: \"fada0329-244a-449f-bb74-2ee74eaba095\") " pod="openshift-marketplace/community-operators-2smdl" Oct 06 07:41:39 crc kubenswrapper[4845]: I1006 07:41:39.772243 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fada0329-244a-449f-bb74-2ee74eaba095-catalog-content\") pod \"community-operators-2smdl\" (UID: \"fada0329-244a-449f-bb74-2ee74eaba095\") " pod="openshift-marketplace/community-operators-2smdl" Oct 06 07:41:39 crc kubenswrapper[4845]: I1006 07:41:39.790568 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnwm4\" (UniqueName: \"kubernetes.io/projected/fada0329-244a-449f-bb74-2ee74eaba095-kube-api-access-pnwm4\") pod \"community-operators-2smdl\" (UID: \"fada0329-244a-449f-bb74-2ee74eaba095\") " pod="openshift-marketplace/community-operators-2smdl" Oct 06 07:41:39 crc kubenswrapper[4845]: I1006 07:41:39.895436 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2smdl" Oct 06 07:41:40 crc kubenswrapper[4845]: I1006 07:41:40.335247 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2smdl"] Oct 06 07:41:41 crc kubenswrapper[4845]: I1006 07:41:41.047702 4845 generic.go:334] "Generic (PLEG): container finished" podID="fada0329-244a-449f-bb74-2ee74eaba095" containerID="6d907c35f83728301f39381f0471986e7d9f80fd1d23ae25afae3c0ee885e811" exitCode=0 Oct 06 07:41:41 crc kubenswrapper[4845]: I1006 07:41:41.047748 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2smdl" event={"ID":"fada0329-244a-449f-bb74-2ee74eaba095","Type":"ContainerDied","Data":"6d907c35f83728301f39381f0471986e7d9f80fd1d23ae25afae3c0ee885e811"} Oct 06 07:41:41 crc kubenswrapper[4845]: I1006 07:41:41.047776 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2smdl" event={"ID":"fada0329-244a-449f-bb74-2ee74eaba095","Type":"ContainerStarted","Data":"fdc8c5e46d2eb519ba27b3f16741b0cc290d5e34060ee82d115b0ded703e3861"} Oct 06 07:41:41 crc kubenswrapper[4845]: I1006 07:41:41.049955 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 07:41:45 crc kubenswrapper[4845]: I1006 07:41:45.090511 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2smdl" event={"ID":"fada0329-244a-449f-bb74-2ee74eaba095","Type":"ContainerStarted","Data":"47113cba4c85801408ae0383de9743a9c9b82b8634154be94956e0d80ace7375"} Oct 06 07:41:46 crc kubenswrapper[4845]: I1006 07:41:46.102221 4845 generic.go:334] "Generic (PLEG): container finished" podID="fada0329-244a-449f-bb74-2ee74eaba095" containerID="47113cba4c85801408ae0383de9743a9c9b82b8634154be94956e0d80ace7375" exitCode=0 Oct 06 07:41:46 crc kubenswrapper[4845]: I1006 07:41:46.102264 4845 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-2smdl" event={"ID":"fada0329-244a-449f-bb74-2ee74eaba095","Type":"ContainerDied","Data":"47113cba4c85801408ae0383de9743a9c9b82b8634154be94956e0d80ace7375"} Oct 06 07:41:46 crc kubenswrapper[4845]: I1006 07:41:46.292641 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-g4k48_38106f1d-b5d4-4d89-b79b-2a6173fe76c2/control-plane-machine-set-operator/0.log" Oct 06 07:41:46 crc kubenswrapper[4845]: I1006 07:41:46.471945 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-v6457_c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e/kube-rbac-proxy/0.log" Oct 06 07:41:46 crc kubenswrapper[4845]: I1006 07:41:46.503995 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-v6457_c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e/machine-api-operator/0.log" Oct 06 07:41:47 crc kubenswrapper[4845]: I1006 07:41:47.112858 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2smdl" event={"ID":"fada0329-244a-449f-bb74-2ee74eaba095","Type":"ContainerStarted","Data":"c7b7e434e7333e836794adf9af76dfd6b6bcd41d242e371c40d3603835196bff"} Oct 06 07:41:47 crc kubenswrapper[4845]: I1006 07:41:47.145712 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2smdl" podStartSLOduration=2.685331983 podStartE2EDuration="8.145688105s" podCreationTimestamp="2025-10-06 07:41:39 +0000 UTC" firstStartedPulling="2025-10-06 07:41:41.049627852 +0000 UTC m=+3385.564368860" lastFinishedPulling="2025-10-06 07:41:46.509983974 +0000 UTC m=+3391.024724982" observedRunningTime="2025-10-06 07:41:47.142286689 +0000 UTC m=+3391.657027717" watchObservedRunningTime="2025-10-06 07:41:47.145688105 +0000 UTC m=+3391.660429123" Oct 06 07:41:49 crc kubenswrapper[4845]: 
I1006 07:41:49.896515 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2smdl" Oct 06 07:41:49 crc kubenswrapper[4845]: I1006 07:41:49.896998 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2smdl" Oct 06 07:41:49 crc kubenswrapper[4845]: I1006 07:41:49.968418 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2smdl" Oct 06 07:41:53 crc kubenswrapper[4845]: I1006 07:41:53.019578 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 07:41:53 crc kubenswrapper[4845]: I1006 07:41:53.019911 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 07:41:53 crc kubenswrapper[4845]: I1006 07:41:53.019955 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" Oct 06 07:41:53 crc kubenswrapper[4845]: I1006 07:41:53.020627 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d44e32c11f635c2d507d4f8ff409ad552dde0d564827032df278547fcdec354d"} pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 07:41:53 crc kubenswrapper[4845]: I1006 07:41:53.020690 4845 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" containerID="cri-o://d44e32c11f635c2d507d4f8ff409ad552dde0d564827032df278547fcdec354d" gracePeriod=600 Oct 06 07:41:53 crc kubenswrapper[4845]: I1006 07:41:53.166533 4845 generic.go:334] "Generic (PLEG): container finished" podID="6936952c-09f0-48fd-8832-38c18202ae81" containerID="d44e32c11f635c2d507d4f8ff409ad552dde0d564827032df278547fcdec354d" exitCode=0 Oct 06 07:41:53 crc kubenswrapper[4845]: I1006 07:41:53.166942 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerDied","Data":"d44e32c11f635c2d507d4f8ff409ad552dde0d564827032df278547fcdec354d"} Oct 06 07:41:53 crc kubenswrapper[4845]: I1006 07:41:53.167060 4845 scope.go:117] "RemoveContainer" containerID="4fd36bb4d8b663218e6e925cbc21644d476f122fac8b4f357cc8f0aa04e829f9" Oct 06 07:41:54 crc kubenswrapper[4845]: I1006 07:41:54.176554 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerStarted","Data":"b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d"} Oct 06 07:41:57 crc kubenswrapper[4845]: I1006 07:41:57.448572 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-s9n2l_d6f5fa25-c3ab-47a0-8541-dfdf1ed2d29e/cert-manager-controller/0.log" Oct 06 07:41:57 crc kubenswrapper[4845]: I1006 07:41:57.603137 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-g57wq_a2715e14-d1ac-4227-b55e-ad1207b34e92/cert-manager-cainjector/0.log" Oct 06 07:41:57 crc kubenswrapper[4845]: I1006 07:41:57.671011 4845 log.go:25] 
"Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-jtnq6_f555f420-64f8-46d7-a41a-d24e3257aea5/cert-manager-webhook/0.log" Oct 06 07:41:59 crc kubenswrapper[4845]: I1006 07:41:59.939408 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2smdl" Oct 06 07:41:59 crc kubenswrapper[4845]: I1006 07:41:59.994624 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2smdl"] Oct 06 07:42:00 crc kubenswrapper[4845]: I1006 07:42:00.083908 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pz4fr"] Oct 06 07:42:00 crc kubenswrapper[4845]: I1006 07:42:00.084277 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pz4fr" podUID="6f51c480-861a-4d08-86e9-d17a340815d0" containerName="registry-server" containerID="cri-o://4ee282a388a25da56522a73385055072bdb99891e7c6148b7dd3dfcb5045c55c" gracePeriod=2 Oct 06 07:42:00 crc kubenswrapper[4845]: I1006 07:42:00.252765 4845 generic.go:334] "Generic (PLEG): container finished" podID="6f51c480-861a-4d08-86e9-d17a340815d0" containerID="4ee282a388a25da56522a73385055072bdb99891e7c6148b7dd3dfcb5045c55c" exitCode=0 Oct 06 07:42:00 crc kubenswrapper[4845]: I1006 07:42:00.253013 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pz4fr" event={"ID":"6f51c480-861a-4d08-86e9-d17a340815d0","Type":"ContainerDied","Data":"4ee282a388a25da56522a73385055072bdb99891e7c6148b7dd3dfcb5045c55c"} Oct 06 07:42:00 crc kubenswrapper[4845]: I1006 07:42:00.604836 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pz4fr" Oct 06 07:42:00 crc kubenswrapper[4845]: I1006 07:42:00.716710 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6hkz\" (UniqueName: \"kubernetes.io/projected/6f51c480-861a-4d08-86e9-d17a340815d0-kube-api-access-v6hkz\") pod \"6f51c480-861a-4d08-86e9-d17a340815d0\" (UID: \"6f51c480-861a-4d08-86e9-d17a340815d0\") " Oct 06 07:42:00 crc kubenswrapper[4845]: I1006 07:42:00.717160 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f51c480-861a-4d08-86e9-d17a340815d0-utilities\") pod \"6f51c480-861a-4d08-86e9-d17a340815d0\" (UID: \"6f51c480-861a-4d08-86e9-d17a340815d0\") " Oct 06 07:42:00 crc kubenswrapper[4845]: I1006 07:42:00.717271 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f51c480-861a-4d08-86e9-d17a340815d0-catalog-content\") pod \"6f51c480-861a-4d08-86e9-d17a340815d0\" (UID: \"6f51c480-861a-4d08-86e9-d17a340815d0\") " Oct 06 07:42:00 crc kubenswrapper[4845]: I1006 07:42:00.717916 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f51c480-861a-4d08-86e9-d17a340815d0-utilities" (OuterVolumeSpecName: "utilities") pod "6f51c480-861a-4d08-86e9-d17a340815d0" (UID: "6f51c480-861a-4d08-86e9-d17a340815d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:42:00 crc kubenswrapper[4845]: I1006 07:42:00.722602 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f51c480-861a-4d08-86e9-d17a340815d0-kube-api-access-v6hkz" (OuterVolumeSpecName: "kube-api-access-v6hkz") pod "6f51c480-861a-4d08-86e9-d17a340815d0" (UID: "6f51c480-861a-4d08-86e9-d17a340815d0"). InnerVolumeSpecName "kube-api-access-v6hkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:42:00 crc kubenswrapper[4845]: I1006 07:42:00.768635 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f51c480-861a-4d08-86e9-d17a340815d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f51c480-861a-4d08-86e9-d17a340815d0" (UID: "6f51c480-861a-4d08-86e9-d17a340815d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:42:00 crc kubenswrapper[4845]: I1006 07:42:00.819073 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6hkz\" (UniqueName: \"kubernetes.io/projected/6f51c480-861a-4d08-86e9-d17a340815d0-kube-api-access-v6hkz\") on node \"crc\" DevicePath \"\"" Oct 06 07:42:00 crc kubenswrapper[4845]: I1006 07:42:00.819108 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f51c480-861a-4d08-86e9-d17a340815d0-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 07:42:00 crc kubenswrapper[4845]: I1006 07:42:00.819118 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f51c480-861a-4d08-86e9-d17a340815d0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 07:42:01 crc kubenswrapper[4845]: I1006 07:42:01.262728 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pz4fr" event={"ID":"6f51c480-861a-4d08-86e9-d17a340815d0","Type":"ContainerDied","Data":"3a4d90861559364134014d072023bc95c75e31d364273ccac7076c5de4794203"} Oct 06 07:42:01 crc kubenswrapper[4845]: I1006 07:42:01.262764 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pz4fr" Oct 06 07:42:01 crc kubenswrapper[4845]: I1006 07:42:01.262786 4845 scope.go:117] "RemoveContainer" containerID="4ee282a388a25da56522a73385055072bdb99891e7c6148b7dd3dfcb5045c55c" Oct 06 07:42:01 crc kubenswrapper[4845]: I1006 07:42:01.292904 4845 scope.go:117] "RemoveContainer" containerID="eb0771592ccce355faa506091d5ccf0f9d0564df907e38027c9ffc664c06228b" Oct 06 07:42:01 crc kubenswrapper[4845]: I1006 07:42:01.298488 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pz4fr"] Oct 06 07:42:01 crc kubenswrapper[4845]: I1006 07:42:01.312993 4845 scope.go:117] "RemoveContainer" containerID="47a8dd772d5234e435e70786d01ad699f1fc126182860ac0be2319798b86d55a" Oct 06 07:42:01 crc kubenswrapper[4845]: I1006 07:42:01.314760 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pz4fr"] Oct 06 07:42:02 crc kubenswrapper[4845]: I1006 07:42:02.237759 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f51c480-861a-4d08-86e9-d17a340815d0" path="/var/lib/kubelet/pods/6f51c480-861a-4d08-86e9-d17a340815d0/volumes" Oct 06 07:42:08 crc kubenswrapper[4845]: I1006 07:42:08.903605 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-grzq8_e39da7c7-c03a-458b-9f86-ef7a914b900a/nmstate-console-plugin/0.log" Oct 06 07:42:09 crc kubenswrapper[4845]: I1006 07:42:09.060049 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-bs789_a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b/nmstate-handler/0.log" Oct 06 07:42:09 crc kubenswrapper[4845]: I1006 07:42:09.100821 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-qsw2n_f7a8f638-cc48-4cca-965a-c3d16476963c/nmstate-metrics/0.log" Oct 06 07:42:09 crc kubenswrapper[4845]: I1006 07:42:09.104431 4845 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-qsw2n_f7a8f638-cc48-4cca-965a-c3d16476963c/kube-rbac-proxy/0.log" Oct 06 07:42:09 crc kubenswrapper[4845]: I1006 07:42:09.282771 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-8lmwp_55d7abcf-08e7-4529-8c8c-1f45ab1ea688/nmstate-operator/0.log" Oct 06 07:42:09 crc kubenswrapper[4845]: I1006 07:42:09.335236 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-s52dg_9d129de3-ab21-48ea-a89c-0c59574eb288/nmstate-webhook/0.log" Oct 06 07:42:22 crc kubenswrapper[4845]: I1006 07:42:22.445672 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-xbpss_96184c2a-b2b5-4dec-b93b-a875c0a07930/kube-rbac-proxy/0.log" Oct 06 07:42:22 crc kubenswrapper[4845]: I1006 07:42:22.512554 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-xbpss_96184c2a-b2b5-4dec-b93b-a875c0a07930/controller/0.log" Oct 06 07:42:22 crc kubenswrapper[4845]: I1006 07:42:22.629645 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-frr-files/0.log" Oct 06 07:42:22 crc kubenswrapper[4845]: I1006 07:42:22.810458 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-frr-files/0.log" Oct 06 07:42:22 crc kubenswrapper[4845]: I1006 07:42:22.813913 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-metrics/0.log" Oct 06 07:42:22 crc kubenswrapper[4845]: I1006 07:42:22.816922 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-reloader/0.log" Oct 06 07:42:22 crc kubenswrapper[4845]: I1006 
07:42:22.841851 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-reloader/0.log" Oct 06 07:42:23 crc kubenswrapper[4845]: I1006 07:42:23.074288 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-frr-files/0.log" Oct 06 07:42:23 crc kubenswrapper[4845]: I1006 07:42:23.085120 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-metrics/0.log" Oct 06 07:42:23 crc kubenswrapper[4845]: I1006 07:42:23.094706 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-reloader/0.log" Oct 06 07:42:23 crc kubenswrapper[4845]: I1006 07:42:23.111156 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-metrics/0.log" Oct 06 07:42:23 crc kubenswrapper[4845]: I1006 07:42:23.252129 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-reloader/0.log" Oct 06 07:42:23 crc kubenswrapper[4845]: I1006 07:42:23.267513 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-metrics/0.log" Oct 06 07:42:23 crc kubenswrapper[4845]: I1006 07:42:23.279699 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/controller/0.log" Oct 06 07:42:23 crc kubenswrapper[4845]: I1006 07:42:23.313516 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-frr-files/0.log" Oct 06 07:42:23 crc kubenswrapper[4845]: I1006 07:42:23.465657 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/frr-metrics/0.log" Oct 06 07:42:23 crc kubenswrapper[4845]: I1006 07:42:23.465707 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/kube-rbac-proxy/0.log" Oct 06 07:42:23 crc kubenswrapper[4845]: I1006 07:42:23.519103 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/kube-rbac-proxy-frr/0.log" Oct 06 07:42:23 crc kubenswrapper[4845]: I1006 07:42:23.644765 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/reloader/0.log" Oct 06 07:42:23 crc kubenswrapper[4845]: I1006 07:42:23.743883 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-kctns_396607a8-6c19-447d-a4a2-7a8ef92d957a/frr-k8s-webhook-server/0.log" Oct 06 07:42:23 crc kubenswrapper[4845]: I1006 07:42:23.935582 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-687d5696cb-xscrv_5501cc41-c16f-423f-a782-96e0d186f44e/manager/0.log" Oct 06 07:42:24 crc kubenswrapper[4845]: I1006 07:42:24.112294 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7c77dc94b4-h9wms_f925eeb5-fe5d-4479-9c06-be3069abc88d/webhook-server/0.log" Oct 06 07:42:24 crc kubenswrapper[4845]: I1006 07:42:24.233242 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tsmdg_fcf3c696-ca9a-4b92-9314-dcc4edb86577/kube-rbac-proxy/0.log" Oct 06 07:42:24 crc kubenswrapper[4845]: I1006 07:42:24.768586 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tsmdg_fcf3c696-ca9a-4b92-9314-dcc4edb86577/speaker/0.log" Oct 06 07:42:24 crc kubenswrapper[4845]: I1006 07:42:24.781754 4845 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/frr/0.log" Oct 06 07:42:35 crc kubenswrapper[4845]: I1006 07:42:35.382046 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d_602b8691-5aec-4f79-b690-a517191505b0/util/0.log" Oct 06 07:42:35 crc kubenswrapper[4845]: I1006 07:42:35.551861 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d_602b8691-5aec-4f79-b690-a517191505b0/util/0.log" Oct 06 07:42:35 crc kubenswrapper[4845]: I1006 07:42:35.562939 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d_602b8691-5aec-4f79-b690-a517191505b0/pull/0.log" Oct 06 07:42:35 crc kubenswrapper[4845]: I1006 07:42:35.650867 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d_602b8691-5aec-4f79-b690-a517191505b0/pull/0.log" Oct 06 07:42:35 crc kubenswrapper[4845]: I1006 07:42:35.750817 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d_602b8691-5aec-4f79-b690-a517191505b0/util/0.log" Oct 06 07:42:35 crc kubenswrapper[4845]: I1006 07:42:35.751759 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d_602b8691-5aec-4f79-b690-a517191505b0/pull/0.log" Oct 06 07:42:35 crc kubenswrapper[4845]: I1006 07:42:35.803699 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d_602b8691-5aec-4f79-b690-a517191505b0/extract/0.log" Oct 06 07:42:35 crc 
kubenswrapper[4845]: I1006 07:42:35.901231 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m7kmk_ccce060c-c044-45d8-8d3c-92cc9e40198a/extract-utilities/0.log" Oct 06 07:42:36 crc kubenswrapper[4845]: I1006 07:42:36.058648 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m7kmk_ccce060c-c044-45d8-8d3c-92cc9e40198a/extract-content/0.log" Oct 06 07:42:36 crc kubenswrapper[4845]: I1006 07:42:36.071158 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m7kmk_ccce060c-c044-45d8-8d3c-92cc9e40198a/extract-utilities/0.log" Oct 06 07:42:36 crc kubenswrapper[4845]: I1006 07:42:36.080892 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m7kmk_ccce060c-c044-45d8-8d3c-92cc9e40198a/extract-content/0.log" Oct 06 07:42:36 crc kubenswrapper[4845]: I1006 07:42:36.234031 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m7kmk_ccce060c-c044-45d8-8d3c-92cc9e40198a/extract-utilities/0.log" Oct 06 07:42:36 crc kubenswrapper[4845]: I1006 07:42:36.255364 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m7kmk_ccce060c-c044-45d8-8d3c-92cc9e40198a/extract-content/0.log" Oct 06 07:42:36 crc kubenswrapper[4845]: I1006 07:42:36.437565 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2smdl_fada0329-244a-449f-bb74-2ee74eaba095/extract-utilities/0.log" Oct 06 07:42:36 crc kubenswrapper[4845]: I1006 07:42:36.620684 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2smdl_fada0329-244a-449f-bb74-2ee74eaba095/extract-utilities/0.log" Oct 06 07:42:36 crc kubenswrapper[4845]: I1006 07:42:36.646843 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-m7kmk_ccce060c-c044-45d8-8d3c-92cc9e40198a/registry-server/0.log" Oct 06 07:42:36 crc kubenswrapper[4845]: I1006 07:42:36.663293 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2smdl_fada0329-244a-449f-bb74-2ee74eaba095/extract-content/0.log" Oct 06 07:42:36 crc kubenswrapper[4845]: I1006 07:42:36.698039 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2smdl_fada0329-244a-449f-bb74-2ee74eaba095/extract-content/0.log" Oct 06 07:42:36 crc kubenswrapper[4845]: I1006 07:42:36.866387 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2smdl_fada0329-244a-449f-bb74-2ee74eaba095/extract-utilities/0.log" Oct 06 07:42:36 crc kubenswrapper[4845]: I1006 07:42:36.928024 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2smdl_fada0329-244a-449f-bb74-2ee74eaba095/extract-content/0.log" Oct 06 07:42:37 crc kubenswrapper[4845]: I1006 07:42:37.029504 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2smdl_fada0329-244a-449f-bb74-2ee74eaba095/registry-server/0.log" Oct 06 07:42:37 crc kubenswrapper[4845]: I1006 07:42:37.081362 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm_e70f5fe4-9f73-4919-87c7-2732d365bdd0/util/0.log" Oct 06 07:42:37 crc kubenswrapper[4845]: I1006 07:42:37.203277 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm_e70f5fe4-9f73-4919-87c7-2732d365bdd0/util/0.log" Oct 06 07:42:37 crc kubenswrapper[4845]: I1006 07:42:37.204386 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm_e70f5fe4-9f73-4919-87c7-2732d365bdd0/pull/0.log" Oct 06 07:42:37 crc kubenswrapper[4845]: I1006 07:42:37.235764 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm_e70f5fe4-9f73-4919-87c7-2732d365bdd0/pull/0.log" Oct 06 07:42:37 crc kubenswrapper[4845]: I1006 07:42:37.402622 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm_e70f5fe4-9f73-4919-87c7-2732d365bdd0/pull/0.log" Oct 06 07:42:37 crc kubenswrapper[4845]: I1006 07:42:37.403518 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm_e70f5fe4-9f73-4919-87c7-2732d365bdd0/extract/0.log" Oct 06 07:42:37 crc kubenswrapper[4845]: I1006 07:42:37.418580 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm_e70f5fe4-9f73-4919-87c7-2732d365bdd0/util/0.log" Oct 06 07:42:37 crc kubenswrapper[4845]: I1006 07:42:37.568235 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zxglv_b5f2f19d-7dbf-4265-8e4c-96739b00f6e2/marketplace-operator/0.log" Oct 06 07:42:37 crc kubenswrapper[4845]: I1006 07:42:37.579876 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ctv9g_c769aa66-5169-4ec2-8993-540bd8bfcfca/extract-utilities/0.log" Oct 06 07:42:37 crc kubenswrapper[4845]: I1006 07:42:37.753665 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ctv9g_c769aa66-5169-4ec2-8993-540bd8bfcfca/extract-utilities/0.log" Oct 06 07:42:37 crc kubenswrapper[4845]: I1006 07:42:37.766234 4845 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ctv9g_c769aa66-5169-4ec2-8993-540bd8bfcfca/extract-content/0.log" Oct 06 07:42:37 crc kubenswrapper[4845]: I1006 07:42:37.768747 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ctv9g_c769aa66-5169-4ec2-8993-540bd8bfcfca/extract-content/0.log" Oct 06 07:42:37 crc kubenswrapper[4845]: I1006 07:42:37.998360 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ctv9g_c769aa66-5169-4ec2-8993-540bd8bfcfca/extract-utilities/0.log" Oct 06 07:42:38 crc kubenswrapper[4845]: I1006 07:42:38.012290 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ctv9g_c769aa66-5169-4ec2-8993-540bd8bfcfca/extract-content/0.log" Oct 06 07:42:38 crc kubenswrapper[4845]: I1006 07:42:38.022593 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ctv9g_c769aa66-5169-4ec2-8993-540bd8bfcfca/registry-server/0.log" Oct 06 07:42:38 crc kubenswrapper[4845]: I1006 07:42:38.149145 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-msb4z_738fc958-3a60-4780-aa02-8af7f6887aa6/extract-utilities/0.log" Oct 06 07:42:38 crc kubenswrapper[4845]: I1006 07:42:38.285691 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-msb4z_738fc958-3a60-4780-aa02-8af7f6887aa6/extract-utilities/0.log" Oct 06 07:42:38 crc kubenswrapper[4845]: I1006 07:42:38.310543 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-msb4z_738fc958-3a60-4780-aa02-8af7f6887aa6/extract-content/0.log" Oct 06 07:42:38 crc kubenswrapper[4845]: I1006 07:42:38.319482 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-msb4z_738fc958-3a60-4780-aa02-8af7f6887aa6/extract-content/0.log" Oct 06 07:42:38 crc kubenswrapper[4845]: I1006 07:42:38.469421 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-msb4z_738fc958-3a60-4780-aa02-8af7f6887aa6/extract-utilities/0.log" Oct 06 07:42:38 crc kubenswrapper[4845]: I1006 07:42:38.505614 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-msb4z_738fc958-3a60-4780-aa02-8af7f6887aa6/extract-content/0.log" Oct 06 07:42:38 crc kubenswrapper[4845]: I1006 07:42:38.921032 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-msb4z_738fc958-3a60-4780-aa02-8af7f6887aa6/registry-server/0.log" Oct 06 07:43:53 crc kubenswrapper[4845]: I1006 07:43:53.018904 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 07:43:53 crc kubenswrapper[4845]: I1006 07:43:53.019513 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 07:44:23 crc kubenswrapper[4845]: I1006 07:44:23.018979 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 07:44:23 crc kubenswrapper[4845]: I1006 07:44:23.019736 4845 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 07:44:26 crc kubenswrapper[4845]: I1006 07:44:26.542787 4845 generic.go:334] "Generic (PLEG): container finished" podID="c57034cd-1f95-4d62-9877-ef39b4ca9e86" containerID="f5fe1e3a752cec4ac5cf2d3274099d223b91be96adc1250135e02314c4ebf035" exitCode=0 Oct 06 07:44:26 crc kubenswrapper[4845]: I1006 07:44:26.542827 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xcvm4/must-gather-5rnl2" event={"ID":"c57034cd-1f95-4d62-9877-ef39b4ca9e86","Type":"ContainerDied","Data":"f5fe1e3a752cec4ac5cf2d3274099d223b91be96adc1250135e02314c4ebf035"} Oct 06 07:44:26 crc kubenswrapper[4845]: I1006 07:44:26.543486 4845 scope.go:117] "RemoveContainer" containerID="f5fe1e3a752cec4ac5cf2d3274099d223b91be96adc1250135e02314c4ebf035" Oct 06 07:44:27 crc kubenswrapper[4845]: I1006 07:44:27.480961 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xcvm4_must-gather-5rnl2_c57034cd-1f95-4d62-9877-ef39b4ca9e86/gather/0.log" Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.062964 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xcvm4/must-gather-5rnl2"] Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.063815 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-xcvm4/must-gather-5rnl2" podUID="c57034cd-1f95-4d62-9877-ef39b4ca9e86" containerName="copy" containerID="cri-o://b3efd07ef8de7479182c74741171acb3d2bb590c32ad0ba373d36a3c140e29ec" gracePeriod=2 Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.071451 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-xcvm4/must-gather-5rnl2"] Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.563351 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xcvm4_must-gather-5rnl2_c57034cd-1f95-4d62-9877-ef39b4ca9e86/copy/0.log" Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.564143 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xcvm4/must-gather-5rnl2" Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.621110 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xcvm4_must-gather-5rnl2_c57034cd-1f95-4d62-9877-ef39b4ca9e86/copy/0.log" Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.621513 4845 generic.go:334] "Generic (PLEG): container finished" podID="c57034cd-1f95-4d62-9877-ef39b4ca9e86" containerID="b3efd07ef8de7479182c74741171acb3d2bb590c32ad0ba373d36a3c140e29ec" exitCode=143 Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.621562 4845 scope.go:117] "RemoveContainer" containerID="b3efd07ef8de7479182c74741171acb3d2bb590c32ad0ba373d36a3c140e29ec" Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.621670 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xcvm4/must-gather-5rnl2" Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.639818 4845 scope.go:117] "RemoveContainer" containerID="f5fe1e3a752cec4ac5cf2d3274099d223b91be96adc1250135e02314c4ebf035" Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.683179 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c57034cd-1f95-4d62-9877-ef39b4ca9e86-must-gather-output\") pod \"c57034cd-1f95-4d62-9877-ef39b4ca9e86\" (UID: \"c57034cd-1f95-4d62-9877-ef39b4ca9e86\") " Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.683264 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw249\" (UniqueName: \"kubernetes.io/projected/c57034cd-1f95-4d62-9877-ef39b4ca9e86-kube-api-access-tw249\") pod \"c57034cd-1f95-4d62-9877-ef39b4ca9e86\" (UID: \"c57034cd-1f95-4d62-9877-ef39b4ca9e86\") " Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.689653 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c57034cd-1f95-4d62-9877-ef39b4ca9e86-kube-api-access-tw249" (OuterVolumeSpecName: "kube-api-access-tw249") pod "c57034cd-1f95-4d62-9877-ef39b4ca9e86" (UID: "c57034cd-1f95-4d62-9877-ef39b4ca9e86"). InnerVolumeSpecName "kube-api-access-tw249". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.718619 4845 scope.go:117] "RemoveContainer" containerID="b3efd07ef8de7479182c74741171acb3d2bb590c32ad0ba373d36a3c140e29ec" Oct 06 07:44:35 crc kubenswrapper[4845]: E1006 07:44:35.722505 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3efd07ef8de7479182c74741171acb3d2bb590c32ad0ba373d36a3c140e29ec\": container with ID starting with b3efd07ef8de7479182c74741171acb3d2bb590c32ad0ba373d36a3c140e29ec not found: ID does not exist" containerID="b3efd07ef8de7479182c74741171acb3d2bb590c32ad0ba373d36a3c140e29ec" Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.722649 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3efd07ef8de7479182c74741171acb3d2bb590c32ad0ba373d36a3c140e29ec"} err="failed to get container status \"b3efd07ef8de7479182c74741171acb3d2bb590c32ad0ba373d36a3c140e29ec\": rpc error: code = NotFound desc = could not find container \"b3efd07ef8de7479182c74741171acb3d2bb590c32ad0ba373d36a3c140e29ec\": container with ID starting with b3efd07ef8de7479182c74741171acb3d2bb590c32ad0ba373d36a3c140e29ec not found: ID does not exist" Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.722722 4845 scope.go:117] "RemoveContainer" containerID="f5fe1e3a752cec4ac5cf2d3274099d223b91be96adc1250135e02314c4ebf035" Oct 06 07:44:35 crc kubenswrapper[4845]: E1006 07:44:35.723796 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5fe1e3a752cec4ac5cf2d3274099d223b91be96adc1250135e02314c4ebf035\": container with ID starting with f5fe1e3a752cec4ac5cf2d3274099d223b91be96adc1250135e02314c4ebf035 not found: ID does not exist" containerID="f5fe1e3a752cec4ac5cf2d3274099d223b91be96adc1250135e02314c4ebf035" Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.723837 
4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5fe1e3a752cec4ac5cf2d3274099d223b91be96adc1250135e02314c4ebf035"} err="failed to get container status \"f5fe1e3a752cec4ac5cf2d3274099d223b91be96adc1250135e02314c4ebf035\": rpc error: code = NotFound desc = could not find container \"f5fe1e3a752cec4ac5cf2d3274099d223b91be96adc1250135e02314c4ebf035\": container with ID starting with f5fe1e3a752cec4ac5cf2d3274099d223b91be96adc1250135e02314c4ebf035 not found: ID does not exist" Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.795345 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw249\" (UniqueName: \"kubernetes.io/projected/c57034cd-1f95-4d62-9877-ef39b4ca9e86-kube-api-access-tw249\") on node \"crc\" DevicePath \"\"" Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.879544 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c57034cd-1f95-4d62-9877-ef39b4ca9e86-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c57034cd-1f95-4d62-9877-ef39b4ca9e86" (UID: "c57034cd-1f95-4d62-9877-ef39b4ca9e86"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:44:35 crc kubenswrapper[4845]: I1006 07:44:35.897168 4845 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c57034cd-1f95-4d62-9877-ef39b4ca9e86-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 06 07:44:36 crc kubenswrapper[4845]: I1006 07:44:36.236998 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c57034cd-1f95-4d62-9877-ef39b4ca9e86" path="/var/lib/kubelet/pods/c57034cd-1f95-4d62-9877-ef39b4ca9e86/volumes" Oct 06 07:44:53 crc kubenswrapper[4845]: I1006 07:44:53.019521 4845 patch_prober.go:28] interesting pod/machine-config-daemon-tpgm6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 07:44:53 crc kubenswrapper[4845]: I1006 07:44:53.020274 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 07:44:53 crc kubenswrapper[4845]: I1006 07:44:53.020346 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" Oct 06 07:44:53 crc kubenswrapper[4845]: I1006 07:44:53.021323 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d"} pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 07:44:53 crc 
kubenswrapper[4845]: I1006 07:44:53.021441 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" containerName="machine-config-daemon" containerID="cri-o://b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d" gracePeriod=600 Oct 06 07:44:53 crc kubenswrapper[4845]: E1006 07:44:53.138926 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:44:53 crc kubenswrapper[4845]: I1006 07:44:53.784441 4845 generic.go:334] "Generic (PLEG): container finished" podID="6936952c-09f0-48fd-8832-38c18202ae81" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d" exitCode=0 Oct 06 07:44:53 crc kubenswrapper[4845]: I1006 07:44:53.784486 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerDied","Data":"b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d"} Oct 06 07:44:53 crc kubenswrapper[4845]: I1006 07:44:53.784881 4845 scope.go:117] "RemoveContainer" containerID="d44e32c11f635c2d507d4f8ff409ad552dde0d564827032df278547fcdec354d" Oct 06 07:44:53 crc kubenswrapper[4845]: I1006 07:44:53.785672 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d" Oct 06 07:44:53 crc kubenswrapper[4845]: E1006 07:44:53.786194 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.180782 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328945-k4txz"] Oct 06 07:45:00 crc kubenswrapper[4845]: E1006 07:45:00.181842 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f51c480-861a-4d08-86e9-d17a340815d0" containerName="registry-server" Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.181861 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f51c480-861a-4d08-86e9-d17a340815d0" containerName="registry-server" Oct 06 07:45:00 crc kubenswrapper[4845]: E1006 07:45:00.181890 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f51c480-861a-4d08-86e9-d17a340815d0" containerName="extract-content" Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.181899 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f51c480-861a-4d08-86e9-d17a340815d0" containerName="extract-content" Oct 06 07:45:00 crc kubenswrapper[4845]: E1006 07:45:00.181914 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c57034cd-1f95-4d62-9877-ef39b4ca9e86" containerName="copy" Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.181922 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57034cd-1f95-4d62-9877-ef39b4ca9e86" containerName="copy" Oct 06 07:45:00 crc kubenswrapper[4845]: E1006 07:45:00.181968 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f51c480-861a-4d08-86e9-d17a340815d0" containerName="extract-utilities" Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.181977 4845 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6f51c480-861a-4d08-86e9-d17a340815d0" containerName="extract-utilities"
Oct 06 07:45:00 crc kubenswrapper[4845]: E1006 07:45:00.181998 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c57034cd-1f95-4d62-9877-ef39b4ca9e86" containerName="gather"
Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.182006 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57034cd-1f95-4d62-9877-ef39b4ca9e86" containerName="gather"
Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.182242 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c57034cd-1f95-4d62-9877-ef39b4ca9e86" containerName="copy"
Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.182265 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f51c480-861a-4d08-86e9-d17a340815d0" containerName="registry-server"
Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.182276 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c57034cd-1f95-4d62-9877-ef39b4ca9e86" containerName="gather"
Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.183205 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328945-k4txz"
Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.191864 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.191979 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.198889 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328945-k4txz"]
Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.269767 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6bhl\" (UniqueName: \"kubernetes.io/projected/1b055342-d2d4-4903-8e94-cea6fe55e396-kube-api-access-x6bhl\") pod \"collect-profiles-29328945-k4txz\" (UID: \"1b055342-d2d4-4903-8e94-cea6fe55e396\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328945-k4txz"
Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.269947 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b055342-d2d4-4903-8e94-cea6fe55e396-config-volume\") pod \"collect-profiles-29328945-k4txz\" (UID: \"1b055342-d2d4-4903-8e94-cea6fe55e396\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328945-k4txz"
Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.270026 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b055342-d2d4-4903-8e94-cea6fe55e396-secret-volume\") pod \"collect-profiles-29328945-k4txz\" (UID: \"1b055342-d2d4-4903-8e94-cea6fe55e396\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328945-k4txz"
Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.374809 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b055342-d2d4-4903-8e94-cea6fe55e396-config-volume\") pod \"collect-profiles-29328945-k4txz\" (UID: \"1b055342-d2d4-4903-8e94-cea6fe55e396\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328945-k4txz"
Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.374886 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b055342-d2d4-4903-8e94-cea6fe55e396-secret-volume\") pod \"collect-profiles-29328945-k4txz\" (UID: \"1b055342-d2d4-4903-8e94-cea6fe55e396\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328945-k4txz"
Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.374992 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6bhl\" (UniqueName: \"kubernetes.io/projected/1b055342-d2d4-4903-8e94-cea6fe55e396-kube-api-access-x6bhl\") pod \"collect-profiles-29328945-k4txz\" (UID: \"1b055342-d2d4-4903-8e94-cea6fe55e396\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328945-k4txz"
Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.376185 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b055342-d2d4-4903-8e94-cea6fe55e396-config-volume\") pod \"collect-profiles-29328945-k4txz\" (UID: \"1b055342-d2d4-4903-8e94-cea6fe55e396\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328945-k4txz"
Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.385842 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b055342-d2d4-4903-8e94-cea6fe55e396-secret-volume\") pod \"collect-profiles-29328945-k4txz\" (UID: \"1b055342-d2d4-4903-8e94-cea6fe55e396\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328945-k4txz"
Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.397639 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6bhl\" (UniqueName: \"kubernetes.io/projected/1b055342-d2d4-4903-8e94-cea6fe55e396-kube-api-access-x6bhl\") pod \"collect-profiles-29328945-k4txz\" (UID: \"1b055342-d2d4-4903-8e94-cea6fe55e396\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328945-k4txz"
Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.511281 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328945-k4txz"
Oct 06 07:45:00 crc kubenswrapper[4845]: I1006 07:45:00.956624 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328945-k4txz"]
Oct 06 07:45:01 crc kubenswrapper[4845]: I1006 07:45:01.422334 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rz8wr/must-gather-sft27"]
Oct 06 07:45:01 crc kubenswrapper[4845]: I1006 07:45:01.424065 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rz8wr/must-gather-sft27"
Oct 06 07:45:01 crc kubenswrapper[4845]: I1006 07:45:01.428075 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rz8wr"/"kube-root-ca.crt"
Oct 06 07:45:01 crc kubenswrapper[4845]: I1006 07:45:01.430318 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rz8wr"/"openshift-service-ca.crt"
Oct 06 07:45:01 crc kubenswrapper[4845]: I1006 07:45:01.450997 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rz8wr/must-gather-sft27"]
Oct 06 07:45:01 crc kubenswrapper[4845]: I1006 07:45:01.495956 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzpbh\" (UniqueName: \"kubernetes.io/projected/1923757d-527d-4867-9c13-f732f7e10077-kube-api-access-pzpbh\") pod \"must-gather-sft27\" (UID: \"1923757d-527d-4867-9c13-f732f7e10077\") " pod="openshift-must-gather-rz8wr/must-gather-sft27"
Oct 06 07:45:01 crc kubenswrapper[4845]: I1006 07:45:01.496012 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1923757d-527d-4867-9c13-f732f7e10077-must-gather-output\") pod \"must-gather-sft27\" (UID: \"1923757d-527d-4867-9c13-f732f7e10077\") " pod="openshift-must-gather-rz8wr/must-gather-sft27"
Oct 06 07:45:01 crc kubenswrapper[4845]: I1006 07:45:01.598113 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzpbh\" (UniqueName: \"kubernetes.io/projected/1923757d-527d-4867-9c13-f732f7e10077-kube-api-access-pzpbh\") pod \"must-gather-sft27\" (UID: \"1923757d-527d-4867-9c13-f732f7e10077\") " pod="openshift-must-gather-rz8wr/must-gather-sft27"
Oct 06 07:45:01 crc kubenswrapper[4845]: I1006 07:45:01.598174 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1923757d-527d-4867-9c13-f732f7e10077-must-gather-output\") pod \"must-gather-sft27\" (UID: \"1923757d-527d-4867-9c13-f732f7e10077\") " pod="openshift-must-gather-rz8wr/must-gather-sft27"
Oct 06 07:45:01 crc kubenswrapper[4845]: I1006 07:45:01.598847 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1923757d-527d-4867-9c13-f732f7e10077-must-gather-output\") pod \"must-gather-sft27\" (UID: \"1923757d-527d-4867-9c13-f732f7e10077\") " pod="openshift-must-gather-rz8wr/must-gather-sft27"
Oct 06 07:45:01 crc kubenswrapper[4845]: I1006 07:45:01.619905 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzpbh\" (UniqueName: \"kubernetes.io/projected/1923757d-527d-4867-9c13-f732f7e10077-kube-api-access-pzpbh\") pod \"must-gather-sft27\" (UID: \"1923757d-527d-4867-9c13-f732f7e10077\") " pod="openshift-must-gather-rz8wr/must-gather-sft27"
Oct 06 07:45:01 crc kubenswrapper[4845]: I1006 07:45:01.760944 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rz8wr/must-gather-sft27"
Oct 06 07:45:01 crc kubenswrapper[4845]: I1006 07:45:01.876456 4845 generic.go:334] "Generic (PLEG): container finished" podID="1b055342-d2d4-4903-8e94-cea6fe55e396" containerID="3d431906e2f3072d72d3cc6678e87e7b2c8e6ba87818f01c6f5b2ff1e8b66c7d" exitCode=0
Oct 06 07:45:01 crc kubenswrapper[4845]: I1006 07:45:01.876532 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328945-k4txz" event={"ID":"1b055342-d2d4-4903-8e94-cea6fe55e396","Type":"ContainerDied","Data":"3d431906e2f3072d72d3cc6678e87e7b2c8e6ba87818f01c6f5b2ff1e8b66c7d"}
Oct 06 07:45:01 crc kubenswrapper[4845]: I1006 07:45:01.876565 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328945-k4txz" event={"ID":"1b055342-d2d4-4903-8e94-cea6fe55e396","Type":"ContainerStarted","Data":"3b01d279981a7d2f3eabf2bc0c2ba8dc3dab7d00225bf08befa62f977644d056"}
Oct 06 07:45:02 crc kubenswrapper[4845]: I1006 07:45:02.178591 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rz8wr/must-gather-sft27"]
Oct 06 07:45:02 crc kubenswrapper[4845]: W1006 07:45:02.183553 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1923757d_527d_4867_9c13_f732f7e10077.slice/crio-50640948d237f86ae29dd2fc9b39c564e1f1c4b97c1d1419d13613b0301646f7 WatchSource:0}: Error finding container 50640948d237f86ae29dd2fc9b39c564e1f1c4b97c1d1419d13613b0301646f7: Status 404 returned error can't find the container with id 50640948d237f86ae29dd2fc9b39c564e1f1c4b97c1d1419d13613b0301646f7
Oct 06 07:45:02 crc kubenswrapper[4845]: I1006 07:45:02.886004 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rz8wr/must-gather-sft27" event={"ID":"1923757d-527d-4867-9c13-f732f7e10077","Type":"ContainerStarted","Data":"55d343e729998928b9db295c1c683c86831a6c1d51d162e5de588bcc739d5a41"}
Oct 06 07:45:02 crc kubenswrapper[4845]: I1006 07:45:02.886466 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rz8wr/must-gather-sft27" event={"ID":"1923757d-527d-4867-9c13-f732f7e10077","Type":"ContainerStarted","Data":"abf4c7857653e73e2d768be14bd29ed96776fe748fe9c486ece007b3c022d904"}
Oct 06 07:45:02 crc kubenswrapper[4845]: I1006 07:45:02.886485 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rz8wr/must-gather-sft27" event={"ID":"1923757d-527d-4867-9c13-f732f7e10077","Type":"ContainerStarted","Data":"50640948d237f86ae29dd2fc9b39c564e1f1c4b97c1d1419d13613b0301646f7"}
Oct 06 07:45:02 crc kubenswrapper[4845]: I1006 07:45:02.910118 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rz8wr/must-gather-sft27" podStartSLOduration=1.910082635 podStartE2EDuration="1.910082635s" podCreationTimestamp="2025-10-06 07:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:45:02.900008779 +0000 UTC m=+3587.414749797" watchObservedRunningTime="2025-10-06 07:45:02.910082635 +0000 UTC m=+3587.424823653"
Oct 06 07:45:03 crc kubenswrapper[4845]: I1006 07:45:03.288523 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328945-k4txz"
Oct 06 07:45:03 crc kubenswrapper[4845]: I1006 07:45:03.335686 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b055342-d2d4-4903-8e94-cea6fe55e396-secret-volume\") pod \"1b055342-d2d4-4903-8e94-cea6fe55e396\" (UID: \"1b055342-d2d4-4903-8e94-cea6fe55e396\") "
Oct 06 07:45:03 crc kubenswrapper[4845]: I1006 07:45:03.335826 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b055342-d2d4-4903-8e94-cea6fe55e396-config-volume\") pod \"1b055342-d2d4-4903-8e94-cea6fe55e396\" (UID: \"1b055342-d2d4-4903-8e94-cea6fe55e396\") "
Oct 06 07:45:03 crc kubenswrapper[4845]: I1006 07:45:03.335867 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6bhl\" (UniqueName: \"kubernetes.io/projected/1b055342-d2d4-4903-8e94-cea6fe55e396-kube-api-access-x6bhl\") pod \"1b055342-d2d4-4903-8e94-cea6fe55e396\" (UID: \"1b055342-d2d4-4903-8e94-cea6fe55e396\") "
Oct 06 07:45:03 crc kubenswrapper[4845]: I1006 07:45:03.338935 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b055342-d2d4-4903-8e94-cea6fe55e396-config-volume" (OuterVolumeSpecName: "config-volume") pod "1b055342-d2d4-4903-8e94-cea6fe55e396" (UID: "1b055342-d2d4-4903-8e94-cea6fe55e396"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 07:45:03 crc kubenswrapper[4845]: I1006 07:45:03.342368 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b055342-d2d4-4903-8e94-cea6fe55e396-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1b055342-d2d4-4903-8e94-cea6fe55e396" (UID: "1b055342-d2d4-4903-8e94-cea6fe55e396"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 07:45:03 crc kubenswrapper[4845]: I1006 07:45:03.342942 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b055342-d2d4-4903-8e94-cea6fe55e396-kube-api-access-x6bhl" (OuterVolumeSpecName: "kube-api-access-x6bhl") pod "1b055342-d2d4-4903-8e94-cea6fe55e396" (UID: "1b055342-d2d4-4903-8e94-cea6fe55e396"). InnerVolumeSpecName "kube-api-access-x6bhl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:45:03 crc kubenswrapper[4845]: I1006 07:45:03.438118 4845 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b055342-d2d4-4903-8e94-cea6fe55e396-config-volume\") on node \"crc\" DevicePath \"\""
Oct 06 07:45:03 crc kubenswrapper[4845]: I1006 07:45:03.438470 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6bhl\" (UniqueName: \"kubernetes.io/projected/1b055342-d2d4-4903-8e94-cea6fe55e396-kube-api-access-x6bhl\") on node \"crc\" DevicePath \"\""
Oct 06 07:45:03 crc kubenswrapper[4845]: I1006 07:45:03.438481 4845 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b055342-d2d4-4903-8e94-cea6fe55e396-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 06 07:45:03 crc kubenswrapper[4845]: I1006 07:45:03.894753 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328945-k4txz" event={"ID":"1b055342-d2d4-4903-8e94-cea6fe55e396","Type":"ContainerDied","Data":"3b01d279981a7d2f3eabf2bc0c2ba8dc3dab7d00225bf08befa62f977644d056"}
Oct 06 07:45:03 crc kubenswrapper[4845]: I1006 07:45:03.894795 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328945-k4txz"
Oct 06 07:45:03 crc kubenswrapper[4845]: I1006 07:45:03.894804 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b01d279981a7d2f3eabf2bc0c2ba8dc3dab7d00225bf08befa62f977644d056"
Oct 06 07:45:04 crc kubenswrapper[4845]: I1006 07:45:04.368309 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z"]
Oct 06 07:45:04 crc kubenswrapper[4845]: I1006 07:45:04.394493 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328900-7p79z"]
Oct 06 07:45:05 crc kubenswrapper[4845]: E1006 07:45:05.132098 4845 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.230:43456->38.102.83.230:37887: write tcp 38.102.83.230:43456->38.102.83.230:37887: write: broken pipe
Oct 06 07:45:05 crc kubenswrapper[4845]: I1006 07:45:05.226986 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d"
Oct 06 07:45:05 crc kubenswrapper[4845]: E1006 07:45:05.227192 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:45:05 crc kubenswrapper[4845]: I1006 07:45:05.809712 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rz8wr/crc-debug-5fb4x"]
Oct 06 07:45:05 crc kubenswrapper[4845]: E1006 07:45:05.810104 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b055342-d2d4-4903-8e94-cea6fe55e396" containerName="collect-profiles"
Oct 06 07:45:05 crc kubenswrapper[4845]: I1006 07:45:05.810120 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b055342-d2d4-4903-8e94-cea6fe55e396" containerName="collect-profiles"
Oct 06 07:45:05 crc kubenswrapper[4845]: I1006 07:45:05.810280 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b055342-d2d4-4903-8e94-cea6fe55e396" containerName="collect-profiles"
Oct 06 07:45:05 crc kubenswrapper[4845]: I1006 07:45:05.810937 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rz8wr/crc-debug-5fb4x"
Oct 06 07:45:05 crc kubenswrapper[4845]: I1006 07:45:05.812772 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-rz8wr"/"default-dockercfg-jgljt"
Oct 06 07:45:05 crc kubenswrapper[4845]: I1006 07:45:05.883560 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbps2\" (UniqueName: \"kubernetes.io/projected/aa55cc2c-abde-46cb-b3c2-81c2a95418cb-kube-api-access-pbps2\") pod \"crc-debug-5fb4x\" (UID: \"aa55cc2c-abde-46cb-b3c2-81c2a95418cb\") " pod="openshift-must-gather-rz8wr/crc-debug-5fb4x"
Oct 06 07:45:05 crc kubenswrapper[4845]: I1006 07:45:05.883800 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa55cc2c-abde-46cb-b3c2-81c2a95418cb-host\") pod \"crc-debug-5fb4x\" (UID: \"aa55cc2c-abde-46cb-b3c2-81c2a95418cb\") " pod="openshift-must-gather-rz8wr/crc-debug-5fb4x"
Oct 06 07:45:05 crc kubenswrapper[4845]: I1006 07:45:05.985903 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa55cc2c-abde-46cb-b3c2-81c2a95418cb-host\") pod \"crc-debug-5fb4x\" (UID: \"aa55cc2c-abde-46cb-b3c2-81c2a95418cb\") " pod="openshift-must-gather-rz8wr/crc-debug-5fb4x"
Oct 06 07:45:05 crc kubenswrapper[4845]: I1006 07:45:05.986276 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbps2\" (UniqueName: \"kubernetes.io/projected/aa55cc2c-abde-46cb-b3c2-81c2a95418cb-kube-api-access-pbps2\") pod \"crc-debug-5fb4x\" (UID: \"aa55cc2c-abde-46cb-b3c2-81c2a95418cb\") " pod="openshift-must-gather-rz8wr/crc-debug-5fb4x"
Oct 06 07:45:05 crc kubenswrapper[4845]: I1006 07:45:05.986104 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa55cc2c-abde-46cb-b3c2-81c2a95418cb-host\") pod \"crc-debug-5fb4x\" (UID: \"aa55cc2c-abde-46cb-b3c2-81c2a95418cb\") " pod="openshift-must-gather-rz8wr/crc-debug-5fb4x"
Oct 06 07:45:06 crc kubenswrapper[4845]: I1006 07:45:06.017304 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbps2\" (UniqueName: \"kubernetes.io/projected/aa55cc2c-abde-46cb-b3c2-81c2a95418cb-kube-api-access-pbps2\") pod \"crc-debug-5fb4x\" (UID: \"aa55cc2c-abde-46cb-b3c2-81c2a95418cb\") " pod="openshift-must-gather-rz8wr/crc-debug-5fb4x"
Oct 06 07:45:06 crc kubenswrapper[4845]: I1006 07:45:06.128213 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rz8wr/crc-debug-5fb4x"
Oct 06 07:45:06 crc kubenswrapper[4845]: W1006 07:45:06.158324 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa55cc2c_abde_46cb_b3c2_81c2a95418cb.slice/crio-1c1b9601f0f2d99300f120ef7262409b87ffc3a49d852137babc92f50ca8e80e WatchSource:0}: Error finding container 1c1b9601f0f2d99300f120ef7262409b87ffc3a49d852137babc92f50ca8e80e: Status 404 returned error can't find the container with id 1c1b9601f0f2d99300f120ef7262409b87ffc3a49d852137babc92f50ca8e80e
Oct 06 07:45:06 crc kubenswrapper[4845]: I1006 07:45:06.238563 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49846891-c3bb-4413-a9ba-1d58fb45faf5" path="/var/lib/kubelet/pods/49846891-c3bb-4413-a9ba-1d58fb45faf5/volumes"
Oct 06 07:45:06 crc kubenswrapper[4845]: I1006 07:45:06.924198 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rz8wr/crc-debug-5fb4x" event={"ID":"aa55cc2c-abde-46cb-b3c2-81c2a95418cb","Type":"ContainerStarted","Data":"b1844b1379bca977f46bfd4e12af99ab78e47adc55d25ae59031cb2425ad4e7c"}
Oct 06 07:45:06 crc kubenswrapper[4845]: I1006 07:45:06.924778 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rz8wr/crc-debug-5fb4x" event={"ID":"aa55cc2c-abde-46cb-b3c2-81c2a95418cb","Type":"ContainerStarted","Data":"1c1b9601f0f2d99300f120ef7262409b87ffc3a49d852137babc92f50ca8e80e"}
Oct 06 07:45:06 crc kubenswrapper[4845]: I1006 07:45:06.939677 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rz8wr/crc-debug-5fb4x" podStartSLOduration=1.939657295 podStartE2EDuration="1.939657295s" podCreationTimestamp="2025-10-06 07:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:45:06.936626868 +0000 UTC m=+3591.451367876" watchObservedRunningTime="2025-10-06 07:45:06.939657295 +0000 UTC m=+3591.454398293"
Oct 06 07:45:20 crc kubenswrapper[4845]: I1006 07:45:20.226566 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d"
Oct 06 07:45:20 crc kubenswrapper[4845]: E1006 07:45:20.227435 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:45:25 crc kubenswrapper[4845]: I1006 07:45:25.172236 4845 scope.go:117] "RemoveContainer" containerID="3dc7981dec70b344ce8c0ed5c0775b7e8291db74528f1e658128f8ed00769817"
Oct 06 07:45:25 crc kubenswrapper[4845]: I1006 07:45:25.200100 4845 scope.go:117] "RemoveContainer" containerID="51ee1108f15883e99ac556879c87186836d445d0844cc2a8e8007d3a7a9aa4c6"
Oct 06 07:45:31 crc kubenswrapper[4845]: I1006 07:45:31.227188 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d"
Oct 06 07:45:31 crc kubenswrapper[4845]: E1006 07:45:31.228091 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:45:35 crc kubenswrapper[4845]: I1006 07:45:35.070355 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jpsht"]
Oct 06 07:45:35 crc kubenswrapper[4845]: I1006 07:45:35.072898 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jpsht"
Oct 06 07:45:35 crc kubenswrapper[4845]: I1006 07:45:35.111583 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jpsht"]
Oct 06 07:45:35 crc kubenswrapper[4845]: I1006 07:45:35.243388 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b599966-1318-4b29-a5a6-72bb422aa78e-catalog-content\") pod \"certified-operators-jpsht\" (UID: \"1b599966-1318-4b29-a5a6-72bb422aa78e\") " pod="openshift-marketplace/certified-operators-jpsht"
Oct 06 07:45:35 crc kubenswrapper[4845]: I1006 07:45:35.243821 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqjg5\" (UniqueName: \"kubernetes.io/projected/1b599966-1318-4b29-a5a6-72bb422aa78e-kube-api-access-lqjg5\") pod \"certified-operators-jpsht\" (UID: \"1b599966-1318-4b29-a5a6-72bb422aa78e\") " pod="openshift-marketplace/certified-operators-jpsht"
Oct 06 07:45:35 crc kubenswrapper[4845]: I1006 07:45:35.243953 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b599966-1318-4b29-a5a6-72bb422aa78e-utilities\") pod \"certified-operators-jpsht\" (UID: \"1b599966-1318-4b29-a5a6-72bb422aa78e\") " pod="openshift-marketplace/certified-operators-jpsht"
Oct 06 07:45:35 crc kubenswrapper[4845]: I1006 07:45:35.346046 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b599966-1318-4b29-a5a6-72bb422aa78e-utilities\") pod \"certified-operators-jpsht\" (UID: \"1b599966-1318-4b29-a5a6-72bb422aa78e\") " pod="openshift-marketplace/certified-operators-jpsht"
Oct 06 07:45:35 crc kubenswrapper[4845]: I1006 07:45:35.346172 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b599966-1318-4b29-a5a6-72bb422aa78e-catalog-content\") pod \"certified-operators-jpsht\" (UID: \"1b599966-1318-4b29-a5a6-72bb422aa78e\") " pod="openshift-marketplace/certified-operators-jpsht"
Oct 06 07:45:35 crc kubenswrapper[4845]: I1006 07:45:35.346197 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqjg5\" (UniqueName: \"kubernetes.io/projected/1b599966-1318-4b29-a5a6-72bb422aa78e-kube-api-access-lqjg5\") pod \"certified-operators-jpsht\" (UID: \"1b599966-1318-4b29-a5a6-72bb422aa78e\") " pod="openshift-marketplace/certified-operators-jpsht"
Oct 06 07:45:35 crc kubenswrapper[4845]: I1006 07:45:35.347019 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b599966-1318-4b29-a5a6-72bb422aa78e-utilities\") pod \"certified-operators-jpsht\" (UID: \"1b599966-1318-4b29-a5a6-72bb422aa78e\") " pod="openshift-marketplace/certified-operators-jpsht"
Oct 06 07:45:35 crc kubenswrapper[4845]: I1006 07:45:35.349335 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b599966-1318-4b29-a5a6-72bb422aa78e-catalog-content\") pod \"certified-operators-jpsht\" (UID: \"1b599966-1318-4b29-a5a6-72bb422aa78e\") " pod="openshift-marketplace/certified-operators-jpsht"
Oct 06 07:45:35 crc kubenswrapper[4845]: I1006 07:45:35.375476 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqjg5\" (UniqueName: \"kubernetes.io/projected/1b599966-1318-4b29-a5a6-72bb422aa78e-kube-api-access-lqjg5\") pod \"certified-operators-jpsht\" (UID: \"1b599966-1318-4b29-a5a6-72bb422aa78e\") " pod="openshift-marketplace/certified-operators-jpsht"
Oct 06 07:45:35 crc kubenswrapper[4845]: I1006 07:45:35.393993 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jpsht"
Oct 06 07:45:35 crc kubenswrapper[4845]: I1006 07:45:35.992423 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jpsht"]
Oct 06 07:45:36 crc kubenswrapper[4845]: I1006 07:45:36.162614 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpsht" event={"ID":"1b599966-1318-4b29-a5a6-72bb422aa78e","Type":"ContainerStarted","Data":"7b8a0742690ca9017940421df21a8e8e0835f90ee4d9e87846ee11a64f4c544a"}
Oct 06 07:45:37 crc kubenswrapper[4845]: I1006 07:45:37.174976 4845 generic.go:334] "Generic (PLEG): container finished" podID="1b599966-1318-4b29-a5a6-72bb422aa78e" containerID="2c982da1697a7e0c6ac5cbc861b2cb95a367e2337290551083a05302c6366ede" exitCode=0
Oct 06 07:45:37 crc kubenswrapper[4845]: I1006 07:45:37.175081 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpsht" event={"ID":"1b599966-1318-4b29-a5a6-72bb422aa78e","Type":"ContainerDied","Data":"2c982da1697a7e0c6ac5cbc861b2cb95a367e2337290551083a05302c6366ede"}
Oct 06 07:45:39 crc kubenswrapper[4845]: I1006 07:45:39.198221 4845 generic.go:334] "Generic (PLEG): container finished" podID="1b599966-1318-4b29-a5a6-72bb422aa78e" containerID="4e08f6df1224bba56f0b110638fa70ddda2767736edb9a8f027141844a007839" exitCode=0
Oct 06 07:45:39 crc kubenswrapper[4845]: I1006 07:45:39.198333 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpsht" event={"ID":"1b599966-1318-4b29-a5a6-72bb422aa78e","Type":"ContainerDied","Data":"4e08f6df1224bba56f0b110638fa70ddda2767736edb9a8f027141844a007839"}
Oct 06 07:45:40 crc kubenswrapper[4845]: I1006 07:45:40.213875 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpsht" event={"ID":"1b599966-1318-4b29-a5a6-72bb422aa78e","Type":"ContainerStarted","Data":"c7de5eaf64b08dcf85bde3f07474c4302f0217c726664279802e20d819d81b15"}
Oct 06 07:45:45 crc kubenswrapper[4845]: I1006 07:45:45.400394 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jpsht"
Oct 06 07:45:45 crc kubenswrapper[4845]: I1006 07:45:45.401150 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jpsht"
Oct 06 07:45:45 crc kubenswrapper[4845]: I1006 07:45:45.445784 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jpsht"
Oct 06 07:45:45 crc kubenswrapper[4845]: I1006 07:45:45.465251 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jpsht" podStartSLOduration=7.979732686 podStartE2EDuration="10.465234448s" podCreationTimestamp="2025-10-06 07:45:35 +0000 UTC" firstStartedPulling="2025-10-06 07:45:37.177761029 +0000 UTC m=+3621.692502037" lastFinishedPulling="2025-10-06 07:45:39.663262791 +0000 UTC m=+3624.178003799" observedRunningTime="2025-10-06 07:45:40.23183939 +0000 UTC m=+3624.746580388" watchObservedRunningTime="2025-10-06 07:45:45.465234448 +0000 UTC m=+3629.979975446"
Oct 06 07:45:46 crc kubenswrapper[4845]: I1006 07:45:46.231954 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d"
Oct 06 07:45:46 crc kubenswrapper[4845]: E1006 07:45:46.232659 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:45:46 crc kubenswrapper[4845]: I1006 07:45:46.316345 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jpsht"
Oct 06 07:45:46 crc kubenswrapper[4845]: I1006 07:45:46.365251 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jpsht"]
Oct 06 07:45:48 crc kubenswrapper[4845]: I1006 07:45:48.285196 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jpsht" podUID="1b599966-1318-4b29-a5a6-72bb422aa78e" containerName="registry-server" containerID="cri-o://c7de5eaf64b08dcf85bde3f07474c4302f0217c726664279802e20d819d81b15" gracePeriod=2
Oct 06 07:45:48 crc kubenswrapper[4845]: I1006 07:45:48.764383 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jpsht"
Oct 06 07:45:48 crc kubenswrapper[4845]: I1006 07:45:48.920206 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b599966-1318-4b29-a5a6-72bb422aa78e-utilities\") pod \"1b599966-1318-4b29-a5a6-72bb422aa78e\" (UID: \"1b599966-1318-4b29-a5a6-72bb422aa78e\") "
Oct 06 07:45:48 crc kubenswrapper[4845]: I1006 07:45:48.920598 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b599966-1318-4b29-a5a6-72bb422aa78e-catalog-content\") pod \"1b599966-1318-4b29-a5a6-72bb422aa78e\" (UID: \"1b599966-1318-4b29-a5a6-72bb422aa78e\") "
Oct 06 07:45:48 crc kubenswrapper[4845]: I1006 07:45:48.920649 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqjg5\" (UniqueName: \"kubernetes.io/projected/1b599966-1318-4b29-a5a6-72bb422aa78e-kube-api-access-lqjg5\") pod \"1b599966-1318-4b29-a5a6-72bb422aa78e\" (UID: \"1b599966-1318-4b29-a5a6-72bb422aa78e\") "
Oct 06 07:45:48 crc kubenswrapper[4845]: I1006 07:45:48.921164 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b599966-1318-4b29-a5a6-72bb422aa78e-utilities" (OuterVolumeSpecName: "utilities") pod "1b599966-1318-4b29-a5a6-72bb422aa78e" (UID: "1b599966-1318-4b29-a5a6-72bb422aa78e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:45:48 crc kubenswrapper[4845]: I1006 07:45:48.934675 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b599966-1318-4b29-a5a6-72bb422aa78e-kube-api-access-lqjg5" (OuterVolumeSpecName: "kube-api-access-lqjg5") pod "1b599966-1318-4b29-a5a6-72bb422aa78e" (UID: "1b599966-1318-4b29-a5a6-72bb422aa78e"). InnerVolumeSpecName "kube-api-access-lqjg5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:45:48 crc kubenswrapper[4845]: I1006 07:45:48.983186 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b599966-1318-4b29-a5a6-72bb422aa78e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b599966-1318-4b29-a5a6-72bb422aa78e" (UID: "1b599966-1318-4b29-a5a6-72bb422aa78e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:45:49 crc kubenswrapper[4845]: I1006 07:45:49.022765 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b599966-1318-4b29-a5a6-72bb422aa78e-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 07:45:49 crc kubenswrapper[4845]: I1006 07:45:49.022792 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqjg5\" (UniqueName: \"kubernetes.io/projected/1b599966-1318-4b29-a5a6-72bb422aa78e-kube-api-access-lqjg5\") on node \"crc\" DevicePath \"\""
Oct 06 07:45:49 crc kubenswrapper[4845]: I1006 07:45:49.022805 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b599966-1318-4b29-a5a6-72bb422aa78e-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 07:45:49 crc kubenswrapper[4845]: I1006 07:45:49.298016 4845 generic.go:334] "Generic (PLEG): container finished" podID="1b599966-1318-4b29-a5a6-72bb422aa78e" containerID="c7de5eaf64b08dcf85bde3f07474c4302f0217c726664279802e20d819d81b15" exitCode=0
Oct 06 07:45:49 crc kubenswrapper[4845]: I1006 07:45:49.298059 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpsht" event={"ID":"1b599966-1318-4b29-a5a6-72bb422aa78e","Type":"ContainerDied","Data":"c7de5eaf64b08dcf85bde3f07474c4302f0217c726664279802e20d819d81b15"}
Oct 06 07:45:49 crc kubenswrapper[4845]: I1006 07:45:49.298084 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpsht" event={"ID":"1b599966-1318-4b29-a5a6-72bb422aa78e","Type":"ContainerDied","Data":"7b8a0742690ca9017940421df21a8e8e0835f90ee4d9e87846ee11a64f4c544a"}
Oct 06 07:45:49 crc kubenswrapper[4845]: I1006 07:45:49.298101 4845 scope.go:117] "RemoveContainer" containerID="c7de5eaf64b08dcf85bde3f07474c4302f0217c726664279802e20d819d81b15"
Oct 06 07:45:49 crc kubenswrapper[4845]: I1006
07:45:49.298233 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jpsht" Oct 06 07:45:49 crc kubenswrapper[4845]: I1006 07:45:49.341217 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jpsht"] Oct 06 07:45:49 crc kubenswrapper[4845]: I1006 07:45:49.364293 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jpsht"] Oct 06 07:45:49 crc kubenswrapper[4845]: I1006 07:45:49.371567 4845 scope.go:117] "RemoveContainer" containerID="4e08f6df1224bba56f0b110638fa70ddda2767736edb9a8f027141844a007839" Oct 06 07:45:49 crc kubenswrapper[4845]: I1006 07:45:49.396616 4845 scope.go:117] "RemoveContainer" containerID="2c982da1697a7e0c6ac5cbc861b2cb95a367e2337290551083a05302c6366ede" Oct 06 07:45:49 crc kubenswrapper[4845]: I1006 07:45:49.453027 4845 scope.go:117] "RemoveContainer" containerID="c7de5eaf64b08dcf85bde3f07474c4302f0217c726664279802e20d819d81b15" Oct 06 07:45:49 crc kubenswrapper[4845]: E1006 07:45:49.453628 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7de5eaf64b08dcf85bde3f07474c4302f0217c726664279802e20d819d81b15\": container with ID starting with c7de5eaf64b08dcf85bde3f07474c4302f0217c726664279802e20d819d81b15 not found: ID does not exist" containerID="c7de5eaf64b08dcf85bde3f07474c4302f0217c726664279802e20d819d81b15" Oct 06 07:45:49 crc kubenswrapper[4845]: I1006 07:45:49.453673 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7de5eaf64b08dcf85bde3f07474c4302f0217c726664279802e20d819d81b15"} err="failed to get container status \"c7de5eaf64b08dcf85bde3f07474c4302f0217c726664279802e20d819d81b15\": rpc error: code = NotFound desc = could not find container \"c7de5eaf64b08dcf85bde3f07474c4302f0217c726664279802e20d819d81b15\": container with ID starting with 
c7de5eaf64b08dcf85bde3f07474c4302f0217c726664279802e20d819d81b15 not found: ID does not exist" Oct 06 07:45:49 crc kubenswrapper[4845]: I1006 07:45:49.453707 4845 scope.go:117] "RemoveContainer" containerID="4e08f6df1224bba56f0b110638fa70ddda2767736edb9a8f027141844a007839" Oct 06 07:45:49 crc kubenswrapper[4845]: E1006 07:45:49.454219 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e08f6df1224bba56f0b110638fa70ddda2767736edb9a8f027141844a007839\": container with ID starting with 4e08f6df1224bba56f0b110638fa70ddda2767736edb9a8f027141844a007839 not found: ID does not exist" containerID="4e08f6df1224bba56f0b110638fa70ddda2767736edb9a8f027141844a007839" Oct 06 07:45:49 crc kubenswrapper[4845]: I1006 07:45:49.454317 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e08f6df1224bba56f0b110638fa70ddda2767736edb9a8f027141844a007839"} err="failed to get container status \"4e08f6df1224bba56f0b110638fa70ddda2767736edb9a8f027141844a007839\": rpc error: code = NotFound desc = could not find container \"4e08f6df1224bba56f0b110638fa70ddda2767736edb9a8f027141844a007839\": container with ID starting with 4e08f6df1224bba56f0b110638fa70ddda2767736edb9a8f027141844a007839 not found: ID does not exist" Oct 06 07:45:49 crc kubenswrapper[4845]: I1006 07:45:49.454424 4845 scope.go:117] "RemoveContainer" containerID="2c982da1697a7e0c6ac5cbc861b2cb95a367e2337290551083a05302c6366ede" Oct 06 07:45:49 crc kubenswrapper[4845]: E1006 07:45:49.454844 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c982da1697a7e0c6ac5cbc861b2cb95a367e2337290551083a05302c6366ede\": container with ID starting with 2c982da1697a7e0c6ac5cbc861b2cb95a367e2337290551083a05302c6366ede not found: ID does not exist" containerID="2c982da1697a7e0c6ac5cbc861b2cb95a367e2337290551083a05302c6366ede" Oct 06 07:45:49 crc 
kubenswrapper[4845]: I1006 07:45:49.454885 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c982da1697a7e0c6ac5cbc861b2cb95a367e2337290551083a05302c6366ede"} err="failed to get container status \"2c982da1697a7e0c6ac5cbc861b2cb95a367e2337290551083a05302c6366ede\": rpc error: code = NotFound desc = could not find container \"2c982da1697a7e0c6ac5cbc861b2cb95a367e2337290551083a05302c6366ede\": container with ID starting with 2c982da1697a7e0c6ac5cbc861b2cb95a367e2337290551083a05302c6366ede not found: ID does not exist" Oct 06 07:45:50 crc kubenswrapper[4845]: I1006 07:45:50.237130 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b599966-1318-4b29-a5a6-72bb422aa78e" path="/var/lib/kubelet/pods/1b599966-1318-4b29-a5a6-72bb422aa78e/volumes" Oct 06 07:46:01 crc kubenswrapper[4845]: I1006 07:46:01.227832 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d" Oct 06 07:46:01 crc kubenswrapper[4845]: E1006 07:46:01.229419 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:46:04 crc kubenswrapper[4845]: I1006 07:46:04.420903 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56f94c67bb-rmg7r_d82278b6-977b-40db-b925-10f8d7621e7c/barbican-api/0.log" Oct 06 07:46:04 crc kubenswrapper[4845]: I1006 07:46:04.482875 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56f94c67bb-rmg7r_d82278b6-977b-40db-b925-10f8d7621e7c/barbican-api-log/0.log" Oct 06 07:46:04 crc kubenswrapper[4845]: I1006 
07:46:04.660898 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c5976ff76-gcfvw_3b247c0d-1911-47a7-83bc-fad6ee6d6172/barbican-keystone-listener/0.log" Oct 06 07:46:04 crc kubenswrapper[4845]: I1006 07:46:04.697009 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c5976ff76-gcfvw_3b247c0d-1911-47a7-83bc-fad6ee6d6172/barbican-keystone-listener-log/0.log" Oct 06 07:46:04 crc kubenswrapper[4845]: I1006 07:46:04.823511 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7d85949b65-6bf6r_a8cd370d-1327-4f32-a12d-e43c99f63f23/barbican-worker/0.log" Oct 06 07:46:04 crc kubenswrapper[4845]: I1006 07:46:04.879498 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7d85949b65-6bf6r_a8cd370d-1327-4f32-a12d-e43c99f63f23/barbican-worker-log/0.log" Oct 06 07:46:05 crc kubenswrapper[4845]: I1006 07:46:05.087731 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-c9ms4_5ca7a9ec-05b1-46ae-bc84-065bf4904784/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:46:05 crc kubenswrapper[4845]: I1006 07:46:05.253458 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_66468bd9-d0ea-4117-a963-3e7fb9b3c54d/ceilometer-central-agent/0.log" Oct 06 07:46:05 crc kubenswrapper[4845]: I1006 07:46:05.278337 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_66468bd9-d0ea-4117-a963-3e7fb9b3c54d/ceilometer-notification-agent/0.log" Oct 06 07:46:05 crc kubenswrapper[4845]: I1006 07:46:05.326025 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_66468bd9-d0ea-4117-a963-3e7fb9b3c54d/proxy-httpd/0.log" Oct 06 07:46:05 crc kubenswrapper[4845]: I1006 07:46:05.422747 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_66468bd9-d0ea-4117-a963-3e7fb9b3c54d/sg-core/0.log" Oct 06 07:46:05 crc kubenswrapper[4845]: I1006 07:46:05.513488 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0cc2b546-3f23-4c16-af0c-84cce0997fe9/cinder-api/0.log" Oct 06 07:46:05 crc kubenswrapper[4845]: I1006 07:46:05.614060 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0cc2b546-3f23-4c16-af0c-84cce0997fe9/cinder-api-log/0.log" Oct 06 07:46:05 crc kubenswrapper[4845]: I1006 07:46:05.769835 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d9789203-7142-4ed7-b8db-7105d5233557/cinder-scheduler/0.log" Oct 06 07:46:05 crc kubenswrapper[4845]: I1006 07:46:05.819126 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d9789203-7142-4ed7-b8db-7105d5233557/probe/0.log" Oct 06 07:46:05 crc kubenswrapper[4845]: I1006 07:46:05.961978 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-bw8t9_4e569fb2-9612-43bd-93ab-bfad8fc42c9c/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:46:06 crc kubenswrapper[4845]: I1006 07:46:06.095454 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-h4msk_57c9565a-619a-48cd-af5f-1dc8141f82af/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:46:06 crc kubenswrapper[4845]: I1006 07:46:06.205930 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-tg64c_21b056d4-86a9-4bdc-a052-8cea0b28efac/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:46:06 crc kubenswrapper[4845]: I1006 07:46:06.328186 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-55565d6cbc-hlnmz_f37a2a86-a24c-4fa6-9944-ba02bf209e32/init/0.log" Oct 06 07:46:06 crc kubenswrapper[4845]: I1006 07:46:06.465213 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55565d6cbc-hlnmz_f37a2a86-a24c-4fa6-9944-ba02bf209e32/init/0.log" Oct 06 07:46:06 crc kubenswrapper[4845]: I1006 07:46:06.524356 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55565d6cbc-hlnmz_f37a2a86-a24c-4fa6-9944-ba02bf209e32/dnsmasq-dns/0.log" Oct 06 07:46:06 crc kubenswrapper[4845]: I1006 07:46:06.660539 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-tqpkj_e0ba0e05-c816-48c7-9d88-a735ea82f3eb/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:46:06 crc kubenswrapper[4845]: I1006 07:46:06.737466 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_07e1706a-220a-4291-b2b3-1b79660ec95b/glance-httpd/0.log" Oct 06 07:46:06 crc kubenswrapper[4845]: I1006 07:46:06.835820 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_07e1706a-220a-4291-b2b3-1b79660ec95b/glance-log/0.log" Oct 06 07:46:06 crc kubenswrapper[4845]: I1006 07:46:06.948918 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_19ec12dd-d7b0-45e8-b569-887bbdf5b6fd/glance-httpd/0.log" Oct 06 07:46:07 crc kubenswrapper[4845]: I1006 07:46:07.037992 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_19ec12dd-d7b0-45e8-b569-887bbdf5b6fd/glance-log/0.log" Oct 06 07:46:07 crc kubenswrapper[4845]: I1006 07:46:07.107579 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-gm46p_705e6ad6-f299-43dc-8d30-7c0bd5039250/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:46:07 crc kubenswrapper[4845]: I1006 07:46:07.322280 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-hb7dv_a9079881-df95-4fe0-a6db-2f085d6d974e/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:46:07 crc kubenswrapper[4845]: I1006 07:46:07.538538 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_0dfda52e-f351-49b0-93b6-e95ce8146051/kube-state-metrics/0.log" Oct 06 07:46:07 crc kubenswrapper[4845]: I1006 07:46:07.540757 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-67cb66d46f-6rxvh_6427f38b-494b-4cd7-b019-aa8db716ffe0/keystone-api/0.log" Oct 06 07:46:07 crc kubenswrapper[4845]: I1006 07:46:07.693220 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-kj28v_50be375c-cf6d-4540-930c-e09f602c4045/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:46:08 crc kubenswrapper[4845]: I1006 07:46:08.084705 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-767548595f-nndsw_f554586f-3f7f-4fe0-9a1b-0ff75662c2e2/neutron-api/0.log" Oct 06 07:46:08 crc kubenswrapper[4845]: I1006 07:46:08.236851 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-767548595f-nndsw_f554586f-3f7f-4fe0-9a1b-0ff75662c2e2/neutron-httpd/0.log" Oct 06 07:46:08 crc kubenswrapper[4845]: I1006 07:46:08.321232 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-2phbh_60f68944-a123-4f0a-ba3f-8215bf68a123/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:46:08 crc kubenswrapper[4845]: I1006 07:46:08.802856 4845 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_nova-api-0_20195d0c-d1c3-476e-86fa-2bc4d2ab39d3/nova-api-log/0.log" Oct 06 07:46:09 crc kubenswrapper[4845]: I1006 07:46:09.136545 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_20195d0c-d1c3-476e-86fa-2bc4d2ab39d3/nova-api-api/0.log" Oct 06 07:46:09 crc kubenswrapper[4845]: I1006 07:46:09.148875 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_5f4ca0cf-85e3-4c26-97d1-cf6682c4d5be/nova-cell0-conductor-conductor/0.log" Oct 06 07:46:09 crc kubenswrapper[4845]: I1006 07:46:09.455815 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_db6ea194-3e38-44a9-9ac4-0182b588cee2/nova-cell1-conductor-conductor/0.log" Oct 06 07:46:09 crc kubenswrapper[4845]: I1006 07:46:09.510971 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_c326f85b-5b04-4ff0-a0e4-29a1e11eefb2/nova-cell1-novncproxy-novncproxy/0.log" Oct 06 07:46:09 crc kubenswrapper[4845]: I1006 07:46:09.733036 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-gftcx_6f89fdcf-abd7-4cf4-aa6f-a05ada603477/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:46:09 crc kubenswrapper[4845]: I1006 07:46:09.880537 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_1fe4c060-2cea-4178-a1ec-33cf60f56ef8/nova-metadata-log/0.log" Oct 06 07:46:10 crc kubenswrapper[4845]: I1006 07:46:10.422887 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_b3d70a5b-fbdb-4d75-bc33-6fef87a933c6/nova-scheduler-scheduler/0.log" Oct 06 07:46:10 crc kubenswrapper[4845]: I1006 07:46:10.452473 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_441b3c5d-0205-472b-8356-e10a4b5b3a4a/mysql-bootstrap/0.log" Oct 06 07:46:10 crc kubenswrapper[4845]: 
I1006 07:46:10.631843 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_441b3c5d-0205-472b-8356-e10a4b5b3a4a/mysql-bootstrap/0.log" Oct 06 07:46:10 crc kubenswrapper[4845]: I1006 07:46:10.683947 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_441b3c5d-0205-472b-8356-e10a4b5b3a4a/galera/0.log" Oct 06 07:46:10 crc kubenswrapper[4845]: I1006 07:46:10.912897 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8d5452f2-c63d-4287-93c4-17b89651a7c1/mysql-bootstrap/0.log" Oct 06 07:46:11 crc kubenswrapper[4845]: I1006 07:46:11.072994 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8d5452f2-c63d-4287-93c4-17b89651a7c1/mysql-bootstrap/0.log" Oct 06 07:46:11 crc kubenswrapper[4845]: I1006 07:46:11.115909 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8d5452f2-c63d-4287-93c4-17b89651a7c1/galera/0.log" Oct 06 07:46:11 crc kubenswrapper[4845]: I1006 07:46:11.296318 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_ec118969-bd05-449c-bb6b-a460bda1b79a/openstackclient/0.log" Oct 06 07:46:11 crc kubenswrapper[4845]: I1006 07:46:11.345685 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_1fe4c060-2cea-4178-a1ec-33cf60f56ef8/nova-metadata-metadata/0.log" Oct 06 07:46:11 crc kubenswrapper[4845]: I1006 07:46:11.545539 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-4qbxx_175127a7-9d27-4976-a4bb-789072f8370c/openstack-network-exporter/0.log" Oct 06 07:46:11 crc kubenswrapper[4845]: I1006 07:46:11.737258 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-n9jwg_1e7e45f8-ca4d-473e-9c7e-12bb2626a080/ovsdb-server-init/0.log" Oct 06 07:46:11 crc kubenswrapper[4845]: I1006 07:46:11.867441 4845 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-n9jwg_1e7e45f8-ca4d-473e-9c7e-12bb2626a080/ovsdb-server-init/0.log" Oct 06 07:46:11 crc kubenswrapper[4845]: I1006 07:46:11.891217 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-n9jwg_1e7e45f8-ca4d-473e-9c7e-12bb2626a080/ovs-vswitchd/0.log" Oct 06 07:46:11 crc kubenswrapper[4845]: I1006 07:46:11.964792 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-n9jwg_1e7e45f8-ca4d-473e-9c7e-12bb2626a080/ovsdb-server/0.log" Oct 06 07:46:12 crc kubenswrapper[4845]: I1006 07:46:12.121794 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-v4zd6_e2ee0908-39a9-4303-aad3-040a922d20a7/ovn-controller/0.log" Oct 06 07:46:12 crc kubenswrapper[4845]: I1006 07:46:12.227546 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d" Oct 06 07:46:12 crc kubenswrapper[4845]: E1006 07:46:12.227818 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:46:12 crc kubenswrapper[4845]: I1006 07:46:12.322357 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fqfdd_c1428fc5-6ae1-4387-9635-69c26981be2a/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:46:12 crc kubenswrapper[4845]: I1006 07:46:12.434481 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bf5f7ffb-f69e-40fe-b8b7-157266325c88/openstack-network-exporter/0.log" Oct 06 07:46:12 crc 
kubenswrapper[4845]: I1006 07:46:12.525223 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bf5f7ffb-f69e-40fe-b8b7-157266325c88/ovn-northd/0.log" Oct 06 07:46:12 crc kubenswrapper[4845]: I1006 07:46:12.664422 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3dad007b-9982-4f85-842c-083964cd2734/openstack-network-exporter/0.log" Oct 06 07:46:12 crc kubenswrapper[4845]: I1006 07:46:12.726831 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3dad007b-9982-4f85-842c-083964cd2734/ovsdbserver-nb/0.log" Oct 06 07:46:12 crc kubenswrapper[4845]: I1006 07:46:12.870111 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_952ffa29-f400-4b01-a4b7-282a401db753/openstack-network-exporter/0.log" Oct 06 07:46:12 crc kubenswrapper[4845]: I1006 07:46:12.938404 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_952ffa29-f400-4b01-a4b7-282a401db753/ovsdbserver-sb/0.log" Oct 06 07:46:13 crc kubenswrapper[4845]: I1006 07:46:13.198435 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8469844cbb-s9qws_0921c6da-fb36-4acf-b978-252f370ccc30/placement-api/0.log" Oct 06 07:46:13 crc kubenswrapper[4845]: I1006 07:46:13.219123 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8469844cbb-s9qws_0921c6da-fb36-4acf-b978-252f370ccc30/placement-log/0.log" Oct 06 07:46:13 crc kubenswrapper[4845]: I1006 07:46:13.452308 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_28832f5d-962f-4eef-8903-ab5061b80102/setup-container/0.log" Oct 06 07:46:13 crc kubenswrapper[4845]: I1006 07:46:13.530114 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_28832f5d-962f-4eef-8903-ab5061b80102/setup-container/0.log" Oct 06 07:46:13 crc kubenswrapper[4845]: I1006 
07:46:13.605614 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_28832f5d-962f-4eef-8903-ab5061b80102/rabbitmq/0.log" Oct 06 07:46:13 crc kubenswrapper[4845]: I1006 07:46:13.701788 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c295b190-30fe-47c3-ae27-c6b809bbe058/setup-container/0.log" Oct 06 07:46:13 crc kubenswrapper[4845]: I1006 07:46:13.938725 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c295b190-30fe-47c3-ae27-c6b809bbe058/setup-container/0.log" Oct 06 07:46:13 crc kubenswrapper[4845]: I1006 07:46:13.981537 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c295b190-30fe-47c3-ae27-c6b809bbe058/rabbitmq/0.log" Oct 06 07:46:14 crc kubenswrapper[4845]: I1006 07:46:14.149867 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-nzxkw_bc1542af-e2ed-4aed-b0e9-0854b00c1320/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:46:14 crc kubenswrapper[4845]: I1006 07:46:14.189547 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-dvkwd_88581a77-2703-438c-a5d0-e6972d815990/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:46:14 crc kubenswrapper[4845]: I1006 07:46:14.471188 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-qrjqg_4358b521-4e22-42e2-9844-79612bf845b8/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:46:14 crc kubenswrapper[4845]: I1006 07:46:14.645914 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-dmcpx_c482754d-cc4d-4480-b2c3-1ae079c9222b/ssh-known-hosts-edpm-deployment/0.log" Oct 06 07:46:14 crc kubenswrapper[4845]: I1006 07:46:14.688823 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bcz6n_88aa0357-70ea-4f4d-80e7-952615d772fe/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:46:14 crc kubenswrapper[4845]: I1006 07:46:14.939035 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-865c95569-jxblm_f3a28a2d-4deb-408d-b47b-600758782cdf/proxy-server/0.log" Oct 06 07:46:15 crc kubenswrapper[4845]: I1006 07:46:15.013599 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-865c95569-jxblm_f3a28a2d-4deb-408d-b47b-600758782cdf/proxy-httpd/0.log" Oct 06 07:46:15 crc kubenswrapper[4845]: I1006 07:46:15.158170 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-86cb9_955c6fb6-c2a5-48cd-8680-632f32157e5c/swift-ring-rebalance/0.log" Oct 06 07:46:15 crc kubenswrapper[4845]: I1006 07:46:15.288649 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/account-auditor/0.log" Oct 06 07:46:15 crc kubenswrapper[4845]: I1006 07:46:15.356652 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/account-reaper/0.log" Oct 06 07:46:15 crc kubenswrapper[4845]: I1006 07:46:15.474346 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/account-server/0.log" Oct 06 07:46:15 crc kubenswrapper[4845]: I1006 07:46:15.491388 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/account-replicator/0.log" Oct 06 07:46:15 crc kubenswrapper[4845]: I1006 07:46:15.537592 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/container-auditor/0.log" Oct 06 07:46:15 crc kubenswrapper[4845]: I1006 07:46:15.687065 4845 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/container-replicator/0.log" Oct 06 07:46:15 crc kubenswrapper[4845]: I1006 07:46:15.701306 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/container-server/0.log" Oct 06 07:46:15 crc kubenswrapper[4845]: I1006 07:46:15.703728 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/container-updater/0.log" Oct 06 07:46:15 crc kubenswrapper[4845]: I1006 07:46:15.875011 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/object-auditor/0.log" Oct 06 07:46:15 crc kubenswrapper[4845]: I1006 07:46:15.885994 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/object-expirer/0.log" Oct 06 07:46:15 crc kubenswrapper[4845]: I1006 07:46:15.937835 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/object-replicator/0.log" Oct 06 07:46:16 crc kubenswrapper[4845]: I1006 07:46:16.059783 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/object-server/0.log" Oct 06 07:46:16 crc kubenswrapper[4845]: I1006 07:46:16.081260 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/object-updater/0.log" Oct 06 07:46:16 crc kubenswrapper[4845]: I1006 07:46:16.143411 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/rsync/0.log" Oct 06 07:46:16 crc kubenswrapper[4845]: I1006 07:46:16.279124 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_ede02a6f-9a89-4a1d-960d-10490334fbd7/swift-recon-cron/0.log" Oct 06 07:46:16 crc kubenswrapper[4845]: I1006 07:46:16.393456 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-zjbw9_81f752a1-1104-4b71-9a89-1c3961584f6f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:46:16 crc kubenswrapper[4845]: I1006 07:46:16.533191 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_1fdf5d3a-7d9e-4702-ae15-1373bbd94574/tempest-tests-tempest-tests-runner/0.log" Oct 06 07:46:16 crc kubenswrapper[4845]: I1006 07:46:16.711819 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_a918bcf8-062d-4a05-8f1b-bc24088f12b7/test-operator-logs-container/0.log" Oct 06 07:46:16 crc kubenswrapper[4845]: I1006 07:46:16.860102 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-qchpk_f9e93bbd-d62e-451f-bea9-c6a926c912a6/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 07:46:23 crc kubenswrapper[4845]: I1006 07:46:23.676902 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_08494b3a-f49b-49af-8e06-df5f4fac3171/memcached/0.log" Oct 06 07:46:27 crc kubenswrapper[4845]: I1006 07:46:27.227648 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d" Oct 06 07:46:27 crc kubenswrapper[4845]: E1006 07:46:27.228276 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:46:40 crc kubenswrapper[4845]: I1006 07:46:40.227611 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d"
Oct 06 07:46:40 crc kubenswrapper[4845]: E1006 07:46:40.228518 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:46:49 crc kubenswrapper[4845]: I1006 07:46:49.832247 4845 generic.go:334] "Generic (PLEG): container finished" podID="aa55cc2c-abde-46cb-b3c2-81c2a95418cb" containerID="b1844b1379bca977f46bfd4e12af99ab78e47adc55d25ae59031cb2425ad4e7c" exitCode=0
Oct 06 07:46:49 crc kubenswrapper[4845]: I1006 07:46:49.832398 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rz8wr/crc-debug-5fb4x" event={"ID":"aa55cc2c-abde-46cb-b3c2-81c2a95418cb","Type":"ContainerDied","Data":"b1844b1379bca977f46bfd4e12af99ab78e47adc55d25ae59031cb2425ad4e7c"}
Oct 06 07:46:50 crc kubenswrapper[4845]: I1006 07:46:50.961593 4845 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-rz8wr/crc-debug-5fb4x"
Oct 06 07:46:50 crc kubenswrapper[4845]: I1006 07:46:50.992494 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rz8wr/crc-debug-5fb4x"]
Oct 06 07:46:50 crc kubenswrapper[4845]: I1006 07:46:50.998939 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rz8wr/crc-debug-5fb4x"]
Oct 06 07:46:51 crc kubenswrapper[4845]: I1006 07:46:51.061815 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa55cc2c-abde-46cb-b3c2-81c2a95418cb-host\") pod \"aa55cc2c-abde-46cb-b3c2-81c2a95418cb\" (UID: \"aa55cc2c-abde-46cb-b3c2-81c2a95418cb\") "
Oct 06 07:46:51 crc kubenswrapper[4845]: I1006 07:46:51.061928 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbps2\" (UniqueName: \"kubernetes.io/projected/aa55cc2c-abde-46cb-b3c2-81c2a95418cb-kube-api-access-pbps2\") pod \"aa55cc2c-abde-46cb-b3c2-81c2a95418cb\" (UID: \"aa55cc2c-abde-46cb-b3c2-81c2a95418cb\") "
Oct 06 07:46:51 crc kubenswrapper[4845]: I1006 07:46:51.061951 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aa55cc2c-abde-46cb-b3c2-81c2a95418cb-host" (OuterVolumeSpecName: "host") pod "aa55cc2c-abde-46cb-b3c2-81c2a95418cb" (UID: "aa55cc2c-abde-46cb-b3c2-81c2a95418cb"). InnerVolumeSpecName "host".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 06 07:46:51 crc kubenswrapper[4845]: I1006 07:46:51.062476 4845 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa55cc2c-abde-46cb-b3c2-81c2a95418cb-host\") on node \"crc\" DevicePath \"\""
Oct 06 07:46:51 crc kubenswrapper[4845]: I1006 07:46:51.067774 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa55cc2c-abde-46cb-b3c2-81c2a95418cb-kube-api-access-pbps2" (OuterVolumeSpecName: "kube-api-access-pbps2") pod "aa55cc2c-abde-46cb-b3c2-81c2a95418cb" (UID: "aa55cc2c-abde-46cb-b3c2-81c2a95418cb"). InnerVolumeSpecName "kube-api-access-pbps2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:46:51 crc kubenswrapper[4845]: I1006 07:46:51.164353 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbps2\" (UniqueName: \"kubernetes.io/projected/aa55cc2c-abde-46cb-b3c2-81c2a95418cb-kube-api-access-pbps2\") on node \"crc\" DevicePath \"\""
Oct 06 07:46:51 crc kubenswrapper[4845]: I1006 07:46:51.852095 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c1b9601f0f2d99300f120ef7262409b87ffc3a49d852137babc92f50ca8e80e"
Oct 06 07:46:51 crc kubenswrapper[4845]: I1006 07:46:51.852144 4845 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-rz8wr/crc-debug-5fb4x"
Oct 06 07:46:52 crc kubenswrapper[4845]: E1006 07:46:52.001752 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa55cc2c_abde_46cb_b3c2_81c2a95418cb.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa55cc2c_abde_46cb_b3c2_81c2a95418cb.slice/crio-1c1b9601f0f2d99300f120ef7262409b87ffc3a49d852137babc92f50ca8e80e\": RecentStats: unable to find data in memory cache]"
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.158548 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rz8wr/crc-debug-5zj5n"]
Oct 06 07:46:52 crc kubenswrapper[4845]: E1006 07:46:52.159189 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b599966-1318-4b29-a5a6-72bb422aa78e" containerName="extract-utilities"
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.159206 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b599966-1318-4b29-a5a6-72bb422aa78e" containerName="extract-utilities"
Oct 06 07:46:52 crc kubenswrapper[4845]: E1006 07:46:52.159216 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b599966-1318-4b29-a5a6-72bb422aa78e" containerName="registry-server"
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.159222 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b599966-1318-4b29-a5a6-72bb422aa78e" containerName="registry-server"
Oct 06 07:46:52 crc kubenswrapper[4845]: E1006 07:46:52.159237 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b599966-1318-4b29-a5a6-72bb422aa78e" containerName="extract-content"
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.159243 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b599966-1318-4b29-a5a6-72bb422aa78e"
containerName="extract-content"
Oct 06 07:46:52 crc kubenswrapper[4845]: E1006 07:46:52.159259 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa55cc2c-abde-46cb-b3c2-81c2a95418cb" containerName="container-00"
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.159264 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa55cc2c-abde-46cb-b3c2-81c2a95418cb" containerName="container-00"
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.159489 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b599966-1318-4b29-a5a6-72bb422aa78e" containerName="registry-server"
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.159505 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa55cc2c-abde-46cb-b3c2-81c2a95418cb" containerName="container-00"
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.160077 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rz8wr/crc-debug-5zj5n"
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.161713 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-rz8wr"/"default-dockercfg-jgljt"
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.237465 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa55cc2c-abde-46cb-b3c2-81c2a95418cb" path="/var/lib/kubelet/pods/aa55cc2c-abde-46cb-b3c2-81c2a95418cb/volumes"
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.286043 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ab479a3-ca9b-4847-8a08-882583cd7ace-host\") pod \"crc-debug-5zj5n\" (UID: \"2ab479a3-ca9b-4847-8a08-882583cd7ace\") " pod="openshift-must-gather-rz8wr/crc-debug-5zj5n"
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.286181 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"kube-api-access-5zhbt\" (UniqueName: \"kubernetes.io/projected/2ab479a3-ca9b-4847-8a08-882583cd7ace-kube-api-access-5zhbt\") pod \"crc-debug-5zj5n\" (UID: \"2ab479a3-ca9b-4847-8a08-882583cd7ace\") " pod="openshift-must-gather-rz8wr/crc-debug-5zj5n"
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.387711 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ab479a3-ca9b-4847-8a08-882583cd7ace-host\") pod \"crc-debug-5zj5n\" (UID: \"2ab479a3-ca9b-4847-8a08-882583cd7ace\") " pod="openshift-must-gather-rz8wr/crc-debug-5zj5n"
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.387842 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zhbt\" (UniqueName: \"kubernetes.io/projected/2ab479a3-ca9b-4847-8a08-882583cd7ace-kube-api-access-5zhbt\") pod \"crc-debug-5zj5n\" (UID: \"2ab479a3-ca9b-4847-8a08-882583cd7ace\") " pod="openshift-must-gather-rz8wr/crc-debug-5zj5n"
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.387845 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ab479a3-ca9b-4847-8a08-882583cd7ace-host\") pod \"crc-debug-5zj5n\" (UID: \"2ab479a3-ca9b-4847-8a08-882583cd7ace\") " pod="openshift-must-gather-rz8wr/crc-debug-5zj5n"
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.406019 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zhbt\" (UniqueName: \"kubernetes.io/projected/2ab479a3-ca9b-4847-8a08-882583cd7ace-kube-api-access-5zhbt\") pod \"crc-debug-5zj5n\" (UID: \"2ab479a3-ca9b-4847-8a08-882583cd7ace\") " pod="openshift-must-gather-rz8wr/crc-debug-5zj5n"
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.477058 4845 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-rz8wr/crc-debug-5zj5n"
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.861520 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rz8wr/crc-debug-5zj5n" event={"ID":"2ab479a3-ca9b-4847-8a08-882583cd7ace","Type":"ContainerStarted","Data":"6f92ea168982d24873d7c5c49ef355e68b44fee3cd3fd09ecb78c7c6d34b1590"}
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.861572 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rz8wr/crc-debug-5zj5n" event={"ID":"2ab479a3-ca9b-4847-8a08-882583cd7ace","Type":"ContainerStarted","Data":"c0320ea113c77ceb66600950f8ee87cf71a5a1431b33cadc4c14d76d57ba7a4f"}
Oct 06 07:46:52 crc kubenswrapper[4845]: I1006 07:46:52.875284 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rz8wr/crc-debug-5zj5n" podStartSLOduration=0.875261664 podStartE2EDuration="875.261664ms" podCreationTimestamp="2025-10-06 07:46:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 07:46:52.87195513 +0000 UTC m=+3697.386696138" watchObservedRunningTime="2025-10-06 07:46:52.875261664 +0000 UTC m=+3697.390002682"
Oct 06 07:46:53 crc kubenswrapper[4845]: I1006 07:46:53.872093 4845 generic.go:334] "Generic (PLEG): container finished" podID="2ab479a3-ca9b-4847-8a08-882583cd7ace" containerID="6f92ea168982d24873d7c5c49ef355e68b44fee3cd3fd09ecb78c7c6d34b1590" exitCode=0
Oct 06 07:46:53 crc kubenswrapper[4845]: I1006 07:46:53.872358 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rz8wr/crc-debug-5zj5n" event={"ID":"2ab479a3-ca9b-4847-8a08-882583cd7ace","Type":"ContainerDied","Data":"6f92ea168982d24873d7c5c49ef355e68b44fee3cd3fd09ecb78c7c6d34b1590"}
Oct 06 07:46:54 crc kubenswrapper[4845]: I1006 07:46:54.226556 4845 scope.go:117] "RemoveContainer"
containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d"
Oct 06 07:46:54 crc kubenswrapper[4845]: E1006 07:46:54.226830 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:46:55 crc kubenswrapper[4845]: I1006 07:46:55.016716 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rz8wr/crc-debug-5zj5n"
Oct 06 07:46:55 crc kubenswrapper[4845]: I1006 07:46:55.151216 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ab479a3-ca9b-4847-8a08-882583cd7ace-host\") pod \"2ab479a3-ca9b-4847-8a08-882583cd7ace\" (UID: \"2ab479a3-ca9b-4847-8a08-882583cd7ace\") "
Oct 06 07:46:55 crc kubenswrapper[4845]: I1006 07:46:55.151338 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zhbt\" (UniqueName: \"kubernetes.io/projected/2ab479a3-ca9b-4847-8a08-882583cd7ace-kube-api-access-5zhbt\") pod \"2ab479a3-ca9b-4847-8a08-882583cd7ace\" (UID: \"2ab479a3-ca9b-4847-8a08-882583cd7ace\") "
Oct 06 07:46:55 crc kubenswrapper[4845]: I1006 07:46:55.151418 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ab479a3-ca9b-4847-8a08-882583cd7ace-host" (OuterVolumeSpecName: "host") pod "2ab479a3-ca9b-4847-8a08-882583cd7ace" (UID: "2ab479a3-ca9b-4847-8a08-882583cd7ace"). InnerVolumeSpecName "host".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 06 07:46:55 crc kubenswrapper[4845]: I1006 07:46:55.151899 4845 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ab479a3-ca9b-4847-8a08-882583cd7ace-host\") on node \"crc\" DevicePath \"\""
Oct 06 07:46:55 crc kubenswrapper[4845]: I1006 07:46:55.157626 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab479a3-ca9b-4847-8a08-882583cd7ace-kube-api-access-5zhbt" (OuterVolumeSpecName: "kube-api-access-5zhbt") pod "2ab479a3-ca9b-4847-8a08-882583cd7ace" (UID: "2ab479a3-ca9b-4847-8a08-882583cd7ace"). InnerVolumeSpecName "kube-api-access-5zhbt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:46:55 crc kubenswrapper[4845]: I1006 07:46:55.253219 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zhbt\" (UniqueName: \"kubernetes.io/projected/2ab479a3-ca9b-4847-8a08-882583cd7ace-kube-api-access-5zhbt\") on node \"crc\" DevicePath \"\""
Oct 06 07:46:55 crc kubenswrapper[4845]: I1006 07:46:55.889732 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rz8wr/crc-debug-5zj5n" event={"ID":"2ab479a3-ca9b-4847-8a08-882583cd7ace","Type":"ContainerDied","Data":"c0320ea113c77ceb66600950f8ee87cf71a5a1431b33cadc4c14d76d57ba7a4f"}
Oct 06 07:46:55 crc kubenswrapper[4845]: I1006 07:46:55.889779 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0320ea113c77ceb66600950f8ee87cf71a5a1431b33cadc4c14d76d57ba7a4f"
Oct 06 07:46:55 crc kubenswrapper[4845]: I1006 07:46:55.890249 4845 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-rz8wr/crc-debug-5zj5n"
Oct 06 07:46:59 crc kubenswrapper[4845]: I1006 07:46:59.540956 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rz8wr/crc-debug-5zj5n"]
Oct 06 07:46:59 crc kubenswrapper[4845]: I1006 07:46:59.547271 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rz8wr/crc-debug-5zj5n"]
Oct 06 07:47:00 crc kubenswrapper[4845]: I1006 07:47:00.240023 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab479a3-ca9b-4847-8a08-882583cd7ace" path="/var/lib/kubelet/pods/2ab479a3-ca9b-4847-8a08-882583cd7ace/volumes"
Oct 06 07:47:00 crc kubenswrapper[4845]: I1006 07:47:00.693107 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rz8wr/crc-debug-nqkc9"]
Oct 06 07:47:00 crc kubenswrapper[4845]: E1006 07:47:00.693564 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab479a3-ca9b-4847-8a08-882583cd7ace" containerName="container-00"
Oct 06 07:47:00 crc kubenswrapper[4845]: I1006 07:47:00.693581 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab479a3-ca9b-4847-8a08-882583cd7ace" containerName="container-00"
Oct 06 07:47:00 crc kubenswrapper[4845]: I1006 07:47:00.693841 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab479a3-ca9b-4847-8a08-882583cd7ace" containerName="container-00"
Oct 06 07:47:00 crc kubenswrapper[4845]: I1006 07:47:00.694618 4845 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-rz8wr/crc-debug-nqkc9"
Oct 06 07:47:00 crc kubenswrapper[4845]: I1006 07:47:00.706549 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-rz8wr"/"default-dockercfg-jgljt"
Oct 06 07:47:00 crc kubenswrapper[4845]: I1006 07:47:00.838135 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59bsn\" (UniqueName: \"kubernetes.io/projected/a804f975-ee7e-4db7-ad85-80b9d6a06ee9-kube-api-access-59bsn\") pod \"crc-debug-nqkc9\" (UID: \"a804f975-ee7e-4db7-ad85-80b9d6a06ee9\") " pod="openshift-must-gather-rz8wr/crc-debug-nqkc9"
Oct 06 07:47:00 crc kubenswrapper[4845]: I1006 07:47:00.838262 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a804f975-ee7e-4db7-ad85-80b9d6a06ee9-host\") pod \"crc-debug-nqkc9\" (UID: \"a804f975-ee7e-4db7-ad85-80b9d6a06ee9\") " pod="openshift-must-gather-rz8wr/crc-debug-nqkc9"
Oct 06 07:47:00 crc kubenswrapper[4845]: I1006 07:47:00.940099 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a804f975-ee7e-4db7-ad85-80b9d6a06ee9-host\") pod \"crc-debug-nqkc9\" (UID: \"a804f975-ee7e-4db7-ad85-80b9d6a06ee9\") " pod="openshift-must-gather-rz8wr/crc-debug-nqkc9"
Oct 06 07:47:00 crc kubenswrapper[4845]: I1006 07:47:00.940238 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a804f975-ee7e-4db7-ad85-80b9d6a06ee9-host\") pod \"crc-debug-nqkc9\" (UID: \"a804f975-ee7e-4db7-ad85-80b9d6a06ee9\") " pod="openshift-must-gather-rz8wr/crc-debug-nqkc9"
Oct 06 07:47:00 crc kubenswrapper[4845]: I1006 07:47:00.940281 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59bsn\" (UniqueName:
\"kubernetes.io/projected/a804f975-ee7e-4db7-ad85-80b9d6a06ee9-kube-api-access-59bsn\") pod \"crc-debug-nqkc9\" (UID: \"a804f975-ee7e-4db7-ad85-80b9d6a06ee9\") " pod="openshift-must-gather-rz8wr/crc-debug-nqkc9"
Oct 06 07:47:00 crc kubenswrapper[4845]: I1006 07:47:00.958886 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59bsn\" (UniqueName: \"kubernetes.io/projected/a804f975-ee7e-4db7-ad85-80b9d6a06ee9-kube-api-access-59bsn\") pod \"crc-debug-nqkc9\" (UID: \"a804f975-ee7e-4db7-ad85-80b9d6a06ee9\") " pod="openshift-must-gather-rz8wr/crc-debug-nqkc9"
Oct 06 07:47:01 crc kubenswrapper[4845]: I1006 07:47:01.016912 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rz8wr/crc-debug-nqkc9"
Oct 06 07:47:01 crc kubenswrapper[4845]: I1006 07:47:01.948675 4845 generic.go:334] "Generic (PLEG): container finished" podID="a804f975-ee7e-4db7-ad85-80b9d6a06ee9" containerID="9218950284e6e19a80c6ac9ae885ab3877f546ca429ca0a4f7a6a35a598b57e3" exitCode=0
Oct 06 07:47:01 crc kubenswrapper[4845]: I1006 07:47:01.948904 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rz8wr/crc-debug-nqkc9" event={"ID":"a804f975-ee7e-4db7-ad85-80b9d6a06ee9","Type":"ContainerDied","Data":"9218950284e6e19a80c6ac9ae885ab3877f546ca429ca0a4f7a6a35a598b57e3"}
Oct 06 07:47:01 crc kubenswrapper[4845]: I1006 07:47:01.949278 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rz8wr/crc-debug-nqkc9" event={"ID":"a804f975-ee7e-4db7-ad85-80b9d6a06ee9","Type":"ContainerStarted","Data":"a026fdabdaae309e53667390eca8ac0096402cca10e40d3cb4ce3a2cfb00c621"}
Oct 06 07:47:01 crc kubenswrapper[4845]: I1006 07:47:01.995734 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rz8wr/crc-debug-nqkc9"]
Oct 06 07:47:02 crc kubenswrapper[4845]: I1006 07:47:02.002486 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api"
pods=["openshift-must-gather-rz8wr/crc-debug-nqkc9"]
Oct 06 07:47:03 crc kubenswrapper[4845]: I1006 07:47:03.057536 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rz8wr/crc-debug-nqkc9"
Oct 06 07:47:03 crc kubenswrapper[4845]: I1006 07:47:03.181699 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a804f975-ee7e-4db7-ad85-80b9d6a06ee9-host\") pod \"a804f975-ee7e-4db7-ad85-80b9d6a06ee9\" (UID: \"a804f975-ee7e-4db7-ad85-80b9d6a06ee9\") "
Oct 06 07:47:03 crc kubenswrapper[4845]: I1006 07:47:03.181812 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59bsn\" (UniqueName: \"kubernetes.io/projected/a804f975-ee7e-4db7-ad85-80b9d6a06ee9-kube-api-access-59bsn\") pod \"a804f975-ee7e-4db7-ad85-80b9d6a06ee9\" (UID: \"a804f975-ee7e-4db7-ad85-80b9d6a06ee9\") "
Oct 06 07:47:03 crc kubenswrapper[4845]: I1006 07:47:03.182550 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a804f975-ee7e-4db7-ad85-80b9d6a06ee9-host" (OuterVolumeSpecName: "host") pod "a804f975-ee7e-4db7-ad85-80b9d6a06ee9" (UID: "a804f975-ee7e-4db7-ad85-80b9d6a06ee9"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 06 07:47:03 crc kubenswrapper[4845]: I1006 07:47:03.187128 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a804f975-ee7e-4db7-ad85-80b9d6a06ee9-kube-api-access-59bsn" (OuterVolumeSpecName: "kube-api-access-59bsn") pod "a804f975-ee7e-4db7-ad85-80b9d6a06ee9" (UID: "a804f975-ee7e-4db7-ad85-80b9d6a06ee9"). InnerVolumeSpecName "kube-api-access-59bsn".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:47:03 crc kubenswrapper[4845]: I1006 07:47:03.284412 4845 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a804f975-ee7e-4db7-ad85-80b9d6a06ee9-host\") on node \"crc\" DevicePath \"\""
Oct 06 07:47:03 crc kubenswrapper[4845]: I1006 07:47:03.284923 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59bsn\" (UniqueName: \"kubernetes.io/projected/a804f975-ee7e-4db7-ad85-80b9d6a06ee9-kube-api-access-59bsn\") on node \"crc\" DevicePath \"\""
Oct 06 07:47:03 crc kubenswrapper[4845]: I1006 07:47:03.365092 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl_609d88e3-66d9-4f44-a539-2b6c35886f06/util/0.log"
Oct 06 07:47:03 crc kubenswrapper[4845]: I1006 07:47:03.545292 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl_609d88e3-66d9-4f44-a539-2b6c35886f06/util/0.log"
Oct 06 07:47:03 crc kubenswrapper[4845]: I1006 07:47:03.555347 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl_609d88e3-66d9-4f44-a539-2b6c35886f06/pull/0.log"
Oct 06 07:47:03 crc kubenswrapper[4845]: I1006 07:47:03.623558 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl_609d88e3-66d9-4f44-a539-2b6c35886f06/pull/0.log"
Oct 06 07:47:03 crc kubenswrapper[4845]: I1006 07:47:03.749341 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl_609d88e3-66d9-4f44-a539-2b6c35886f06/util/0.log"
Oct 06 07:47:03 crc kubenswrapper[4845]: I1006 07:47:03.769190 4845 log.go:25] "Finished parsing log file"
path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl_609d88e3-66d9-4f44-a539-2b6c35886f06/pull/0.log"
Oct 06 07:47:03 crc kubenswrapper[4845]: I1006 07:47:03.785654 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8rtwzl_609d88e3-66d9-4f44-a539-2b6c35886f06/extract/0.log"
Oct 06 07:47:03 crc kubenswrapper[4845]: I1006 07:47:03.940354 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5b974f6766-qzbsb_2e4d8467-600b-4ae3-8b1a-c4f0416718f7/kube-rbac-proxy/0.log"
Oct 06 07:47:03 crc kubenswrapper[4845]: I1006 07:47:03.958752 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5b974f6766-qzbsb_2e4d8467-600b-4ae3-8b1a-c4f0416718f7/manager/0.log"
Oct 06 07:47:03 crc kubenswrapper[4845]: I1006 07:47:03.972125 4845 scope.go:117] "RemoveContainer" containerID="9218950284e6e19a80c6ac9ae885ab3877f546ca429ca0a4f7a6a35a598b57e3"
Oct 06 07:47:03 crc kubenswrapper[4845]: I1006 07:47:03.972138 4845 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-rz8wr/crc-debug-nqkc9"
Oct 06 07:47:04 crc kubenswrapper[4845]: I1006 07:47:04.005662 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-84bd8f6848-mvrqc_4629df36-951c-461d-9b54-c69cbec8bcd5/kube-rbac-proxy/0.log"
Oct 06 07:47:04 crc kubenswrapper[4845]: I1006 07:47:04.135142 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58d86cd59d-hf7mm_0a59590c-5261-403e-a7e3-0e726a025412/kube-rbac-proxy/0.log"
Oct 06 07:47:04 crc kubenswrapper[4845]: I1006 07:47:04.167570 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-84bd8f6848-mvrqc_4629df36-951c-461d-9b54-c69cbec8bcd5/manager/0.log"
Oct 06 07:47:04 crc kubenswrapper[4845]: I1006 07:47:04.218552 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58d86cd59d-hf7mm_0a59590c-5261-403e-a7e3-0e726a025412/manager/0.log"
Oct 06 07:47:04 crc kubenswrapper[4845]: I1006 07:47:04.236466 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a804f975-ee7e-4db7-ad85-80b9d6a06ee9" path="/var/lib/kubelet/pods/a804f975-ee7e-4db7-ad85-80b9d6a06ee9/volumes"
Oct 06 07:47:04 crc kubenswrapper[4845]: I1006 07:47:04.338198 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-698456cdc6-kdbk5_540c7899-a612-4745-8c42-02033d088f73/kube-rbac-proxy/0.log"
Oct 06 07:47:04 crc kubenswrapper[4845]: I1006 07:47:04.446196 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-698456cdc6-kdbk5_540c7899-a612-4745-8c42-02033d088f73/manager/0.log"
Oct 06 07:47:04 crc kubenswrapper[4845]: I1006 07:47:04.558806 4845 log.go:25] "Finished parsing log file"
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5c497dbdb-nw58b_5166cd91-6a8c-4b81-b311-2a1e561928d3/kube-rbac-proxy/0.log"
Oct 06 07:47:04 crc kubenswrapper[4845]: I1006 07:47:04.580067 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5c497dbdb-nw58b_5166cd91-6a8c-4b81-b311-2a1e561928d3/manager/0.log"
Oct 06 07:47:04 crc kubenswrapper[4845]: I1006 07:47:04.654754 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6675647785-4xx5n_c3eb4ea2-4738-4a6e-9eed-f41f7a616cdd/kube-rbac-proxy/0.log"
Oct 06 07:47:04 crc kubenswrapper[4845]: I1006 07:47:04.752807 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6675647785-4xx5n_c3eb4ea2-4738-4a6e-9eed-f41f7a616cdd/manager/0.log"
Oct 06 07:47:04 crc kubenswrapper[4845]: I1006 07:47:04.788717 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84788b6bc5-s8lqw_3f91fed8-7759-443b-869a-886f63b42502/kube-rbac-proxy/0.log"
Oct 06 07:47:04 crc kubenswrapper[4845]: I1006 07:47:04.953108 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f5894c49f-lx8hg_01476e52-bab1-4b3b-b2d9-3a9d9469c943/manager/0.log"
Oct 06 07:47:04 crc kubenswrapper[4845]: I1006 07:47:04.958677 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84788b6bc5-s8lqw_3f91fed8-7759-443b-869a-886f63b42502/manager/0.log"
Oct 06 07:47:04 crc kubenswrapper[4845]: I1006 07:47:04.972878 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f5894c49f-lx8hg_01476e52-bab1-4b3b-b2d9-3a9d9469c943/kube-rbac-proxy/0.log"
Oct 06 07:47:05 crc kubenswrapper[4845]: I1006 07:47:05.110475 4845
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-57c9cdcf57-tf759_09e0aaab-0038-4f9b-881e-3774781f2825/kube-rbac-proxy/0.log"
Oct 06 07:47:05 crc kubenswrapper[4845]: I1006 07:47:05.195085 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-57c9cdcf57-tf759_09e0aaab-0038-4f9b-881e-3774781f2825/manager/0.log"
Oct 06 07:47:05 crc kubenswrapper[4845]: I1006 07:47:05.226543 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d"
Oct 06 07:47:05 crc kubenswrapper[4845]: E1006 07:47:05.226780 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:47:05 crc kubenswrapper[4845]: I1006 07:47:05.271884 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7cb48dbc-6vhjt_12269390-1665-4934-8695-eab596535e81/manager/0.log"
Oct 06 07:47:05 crc kubenswrapper[4845]: I1006 07:47:05.302986 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7cb48dbc-6vhjt_12269390-1665-4934-8695-eab596535e81/kube-rbac-proxy/0.log"
Oct 06 07:47:05 crc kubenswrapper[4845]: I1006 07:47:05.423272 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-d6c9dc5bc-wghng_223d7355-4741-4d59-b6cf-e71702ddc20e/kube-rbac-proxy/0.log"
Oct 06 07:47:05 crc kubenswrapper[4845]: I1006 07:47:05.461807 4845 log.go:25] "Finished parsing log file"
path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-d6c9dc5bc-wghng_223d7355-4741-4d59-b6cf-e71702ddc20e/manager/0.log" Oct 06 07:47:05 crc kubenswrapper[4845]: I1006 07:47:05.544515 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-69b956fbf6-c4htj_49f6ba5b-4750-418f-ac64-574d92bf6f61/kube-rbac-proxy/0.log" Oct 06 07:47:05 crc kubenswrapper[4845]: I1006 07:47:05.658120 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-69b956fbf6-c4htj_49f6ba5b-4750-418f-ac64-574d92bf6f61/manager/0.log" Oct 06 07:47:05 crc kubenswrapper[4845]: I1006 07:47:05.709505 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6c9b57c67-wrcqz_93be683f-25a1-477e-b676-5bc7be2c3bf8/kube-rbac-proxy/0.log" Oct 06 07:47:05 crc kubenswrapper[4845]: I1006 07:47:05.818614 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6c9b57c67-wrcqz_93be683f-25a1-477e-b676-5bc7be2c3bf8/manager/0.log" Oct 06 07:47:05 crc kubenswrapper[4845]: I1006 07:47:05.873349 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f59f9d8-t66dv_58567cef-7ff1-455c-a1d3-1f7a6f35a504/kube-rbac-proxy/0.log" Oct 06 07:47:05 crc kubenswrapper[4845]: I1006 07:47:05.944934 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f59f9d8-t66dv_58567cef-7ff1-455c-a1d3-1f7a6f35a504/manager/0.log" Oct 06 07:47:06 crc kubenswrapper[4845]: I1006 07:47:06.072609 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-66cc85b5d582qhb_1877f632-ca26-4045-a192-08d2f0f97a4e/manager/0.log" Oct 06 07:47:06 crc kubenswrapper[4845]: I1006 07:47:06.072737 
4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-66cc85b5d582qhb_1877f632-ca26-4045-a192-08d2f0f97a4e/kube-rbac-proxy/0.log" Oct 06 07:47:06 crc kubenswrapper[4845]: I1006 07:47:06.199761 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7cfc658b9-dlvb8_05de66df-c97e-40d4-a605-188b3d8e66eb/kube-rbac-proxy/0.log" Oct 06 07:47:06 crc kubenswrapper[4845]: I1006 07:47:06.305910 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-677d5bb784-zdlwp_f8c7a5a4-11d3-4d38-95d5-fb90e97378b9/kube-rbac-proxy/0.log" Oct 06 07:47:06 crc kubenswrapper[4845]: I1006 07:47:06.602674 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-677d5bb784-zdlwp_f8c7a5a4-11d3-4d38-95d5-fb90e97378b9/operator/0.log" Oct 06 07:47:06 crc kubenswrapper[4845]: I1006 07:47:06.634879 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-k8dzl_6cbeaf69-7c1e-472a-8383-209ac778658e/registry-server/0.log" Oct 06 07:47:06 crc kubenswrapper[4845]: I1006 07:47:06.781458 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-c968bb45-h4m5g_8342c3c4-6f77-4647-a4c2-9f834c55ee19/kube-rbac-proxy/0.log" Oct 06 07:47:06 crc kubenswrapper[4845]: I1006 07:47:06.946191 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-c968bb45-h4m5g_8342c3c4-6f77-4647-a4c2-9f834c55ee19/manager/0.log" Oct 06 07:47:07 crc kubenswrapper[4845]: I1006 07:47:07.027290 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-66f6d6849b-5qmws_c21520ff-b41e-4433-8656-d248f0975c60/manager/0.log" Oct 06 07:47:07 crc 
kubenswrapper[4845]: I1006 07:47:07.043728 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-66f6d6849b-5qmws_c21520ff-b41e-4433-8656-d248f0975c60/kube-rbac-proxy/0.log" Oct 06 07:47:07 crc kubenswrapper[4845]: I1006 07:47:07.174318 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-c4dsw_880d791a-bcb7-4f71-8a16-015bd26af4d9/operator/0.log" Oct 06 07:47:07 crc kubenswrapper[4845]: I1006 07:47:07.291574 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-76d5577b-9v85j_38e8455f-f063-46aa-8275-20b6d80aa9ea/kube-rbac-proxy/0.log" Oct 06 07:47:07 crc kubenswrapper[4845]: I1006 07:47:07.350040 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7cfc658b9-dlvb8_05de66df-c97e-40d4-a605-188b3d8e66eb/manager/0.log" Oct 06 07:47:07 crc kubenswrapper[4845]: I1006 07:47:07.454712 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-76d5577b-9v85j_38e8455f-f063-46aa-8275-20b6d80aa9ea/manager/0.log" Oct 06 07:47:07 crc kubenswrapper[4845]: I1006 07:47:07.504057 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-f589c7597-c2mlw_e3721fe9-cb88-4326-aa90-2d08b909515e/kube-rbac-proxy/0.log" Oct 06 07:47:07 crc kubenswrapper[4845]: I1006 07:47:07.562964 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-f589c7597-c2mlw_e3721fe9-cb88-4326-aa90-2d08b909515e/manager/0.log" Oct 06 07:47:07 crc kubenswrapper[4845]: I1006 07:47:07.670970 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb6dcddc-nsmkl_c921a4a9-e09a-4fd2-965e-c13f7fee169e/manager/0.log" Oct 06 07:47:07 crc kubenswrapper[4845]: I1006 07:47:07.703246 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb6dcddc-nsmkl_c921a4a9-e09a-4fd2-965e-c13f7fee169e/kube-rbac-proxy/0.log" Oct 06 07:47:07 crc kubenswrapper[4845]: I1006 07:47:07.726405 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5d98cc5575-mj4wz_dcb43cda-fbc6-4092-bf5d-296858e233cd/kube-rbac-proxy/0.log" Oct 06 07:47:07 crc kubenswrapper[4845]: I1006 07:47:07.775883 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5d98cc5575-mj4wz_dcb43cda-fbc6-4092-bf5d-296858e233cd/manager/0.log" Oct 06 07:47:18 crc kubenswrapper[4845]: I1006 07:47:18.226848 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d" Oct 06 07:47:18 crc kubenswrapper[4845]: E1006 07:47:18.227663 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:47:22 crc kubenswrapper[4845]: I1006 07:47:22.712359 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-g4k48_38106f1d-b5d4-4d89-b79b-2a6173fe76c2/control-plane-machine-set-operator/0.log" Oct 06 07:47:22 crc kubenswrapper[4845]: I1006 07:47:22.990238 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-v6457_c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e/kube-rbac-proxy/0.log" Oct 06 07:47:23 crc kubenswrapper[4845]: I1006 07:47:23.009151 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-v6457_c7bbfaf8-1ad7-4f2a-b83a-cac75df1ba0e/machine-api-operator/0.log" Oct 06 07:47:25 crc kubenswrapper[4845]: I1006 07:47:25.393613 4845 scope.go:117] "RemoveContainer" containerID="22244d55004bbb4d5a058c2a963d112cd372c9d5b213a9b38f5775ece70666e8" Oct 06 07:47:30 crc kubenswrapper[4845]: I1006 07:47:30.227825 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d" Oct 06 07:47:30 crc kubenswrapper[4845]: E1006 07:47:30.228656 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:47:33 crc kubenswrapper[4845]: I1006 07:47:33.231311 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-s9n2l_d6f5fa25-c3ab-47a0-8541-dfdf1ed2d29e/cert-manager-controller/0.log" Oct 06 07:47:33 crc kubenswrapper[4845]: I1006 07:47:33.389914 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-g57wq_a2715e14-d1ac-4227-b55e-ad1207b34e92/cert-manager-cainjector/0.log" Oct 06 07:47:33 crc kubenswrapper[4845]: I1006 07:47:33.500542 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-jtnq6_f555f420-64f8-46d7-a41a-d24e3257aea5/cert-manager-webhook/0.log" Oct 06 07:47:41 crc 
kubenswrapper[4845]: I1006 07:47:41.227082 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d" Oct 06 07:47:41 crc kubenswrapper[4845]: E1006 07:47:41.227910 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:47:44 crc kubenswrapper[4845]: I1006 07:47:44.608774 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-grzq8_e39da7c7-c03a-458b-9f86-ef7a914b900a/nmstate-console-plugin/0.log" Oct 06 07:47:44 crc kubenswrapper[4845]: I1006 07:47:44.771777 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-bs789_a727e063-0f73-4a4e-8f8b-4ed68ebc1e3b/nmstate-handler/0.log" Oct 06 07:47:44 crc kubenswrapper[4845]: I1006 07:47:44.836479 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-qsw2n_f7a8f638-cc48-4cca-965a-c3d16476963c/kube-rbac-proxy/0.log" Oct 06 07:47:44 crc kubenswrapper[4845]: I1006 07:47:44.966449 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-qsw2n_f7a8f638-cc48-4cca-965a-c3d16476963c/nmstate-metrics/0.log" Oct 06 07:47:45 crc kubenswrapper[4845]: I1006 07:47:45.583947 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-8lmwp_55d7abcf-08e7-4529-8c8c-1f45ab1ea688/nmstate-operator/0.log" Oct 06 07:47:45 crc kubenswrapper[4845]: I1006 07:47:45.755959 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-s52dg_9d129de3-ab21-48ea-a89c-0c59574eb288/nmstate-webhook/0.log" Oct 06 07:47:52 crc kubenswrapper[4845]: I1006 07:47:52.227757 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d" Oct 06 07:47:52 crc kubenswrapper[4845]: E1006 07:47:52.228814 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:47:59 crc kubenswrapper[4845]: I1006 07:47:59.273076 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-xbpss_96184c2a-b2b5-4dec-b93b-a875c0a07930/kube-rbac-proxy/0.log" Oct 06 07:47:59 crc kubenswrapper[4845]: I1006 07:47:59.375973 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-xbpss_96184c2a-b2b5-4dec-b93b-a875c0a07930/controller/0.log" Oct 06 07:47:59 crc kubenswrapper[4845]: I1006 07:47:59.486766 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-frr-files/0.log" Oct 06 07:47:59 crc kubenswrapper[4845]: I1006 07:47:59.678368 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-metrics/0.log" Oct 06 07:47:59 crc kubenswrapper[4845]: I1006 07:47:59.678368 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-frr-files/0.log" Oct 06 07:47:59 crc kubenswrapper[4845]: I1006 07:47:59.723429 4845 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-reloader/0.log" Oct 06 07:47:59 crc kubenswrapper[4845]: I1006 07:47:59.728841 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-reloader/0.log" Oct 06 07:47:59 crc kubenswrapper[4845]: I1006 07:47:59.872770 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-reloader/0.log" Oct 06 07:47:59 crc kubenswrapper[4845]: I1006 07:47:59.878013 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-metrics/0.log" Oct 06 07:47:59 crc kubenswrapper[4845]: I1006 07:47:59.879241 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-frr-files/0.log" Oct 06 07:47:59 crc kubenswrapper[4845]: I1006 07:47:59.921104 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-metrics/0.log" Oct 06 07:48:00 crc kubenswrapper[4845]: I1006 07:48:00.111704 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-frr-files/0.log" Oct 06 07:48:00 crc kubenswrapper[4845]: I1006 07:48:00.118829 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-metrics/0.log" Oct 06 07:48:00 crc kubenswrapper[4845]: I1006 07:48:00.134274 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/cp-reloader/0.log" Oct 06 07:48:00 crc kubenswrapper[4845]: I1006 07:48:00.137785 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/controller/0.log" Oct 06 07:48:00 crc kubenswrapper[4845]: I1006 07:48:00.302059 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/frr-metrics/0.log" Oct 06 07:48:00 crc kubenswrapper[4845]: I1006 07:48:00.355316 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/kube-rbac-proxy/0.log" Oct 06 07:48:00 crc kubenswrapper[4845]: I1006 07:48:00.380921 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/kube-rbac-proxy-frr/0.log" Oct 06 07:48:00 crc kubenswrapper[4845]: I1006 07:48:00.520624 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/reloader/0.log" Oct 06 07:48:00 crc kubenswrapper[4845]: I1006 07:48:00.868660 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-kctns_396607a8-6c19-447d-a4a2-7a8ef92d957a/frr-k8s-webhook-server/0.log" Oct 06 07:48:01 crc kubenswrapper[4845]: I1006 07:48:01.125400 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-687d5696cb-xscrv_5501cc41-c16f-423f-a782-96e0d186f44e/manager/0.log" Oct 06 07:48:01 crc kubenswrapper[4845]: I1006 07:48:01.214345 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7c77dc94b4-h9wms_f925eeb5-fe5d-4479-9c06-be3069abc88d/webhook-server/0.log" Oct 06 07:48:01 crc kubenswrapper[4845]: I1006 07:48:01.367248 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tsmdg_fcf3c696-ca9a-4b92-9314-dcc4edb86577/kube-rbac-proxy/0.log" Oct 06 07:48:01 crc kubenswrapper[4845]: I1006 07:48:01.689596 4845 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dpj7m_ff6e65ed-e4d3-4fce-b1c9-87eb219c2924/frr/0.log" Oct 06 07:48:01 crc kubenswrapper[4845]: I1006 07:48:01.822063 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tsmdg_fcf3c696-ca9a-4b92-9314-dcc4edb86577/speaker/0.log" Oct 06 07:48:05 crc kubenswrapper[4845]: I1006 07:48:05.227530 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d" Oct 06 07:48:05 crc kubenswrapper[4845]: E1006 07:48:05.228217 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81" Oct 06 07:48:08 crc kubenswrapper[4845]: I1006 07:48:08.825183 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7t2qg"] Oct 06 07:48:08 crc kubenswrapper[4845]: E1006 07:48:08.826015 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a804f975-ee7e-4db7-ad85-80b9d6a06ee9" containerName="container-00" Oct 06 07:48:08 crc kubenswrapper[4845]: I1006 07:48:08.826033 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="a804f975-ee7e-4db7-ad85-80b9d6a06ee9" containerName="container-00" Oct 06 07:48:08 crc kubenswrapper[4845]: I1006 07:48:08.826216 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="a804f975-ee7e-4db7-ad85-80b9d6a06ee9" containerName="container-00" Oct 06 07:48:08 crc kubenswrapper[4845]: I1006 07:48:08.827506 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7t2qg" Oct 06 07:48:08 crc kubenswrapper[4845]: I1006 07:48:08.892944 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7t2qg"] Oct 06 07:48:09 crc kubenswrapper[4845]: I1006 07:48:09.008036 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4090ae9-a6e8-4c81-92cc-0ab97d5f7427-catalog-content\") pod \"redhat-operators-7t2qg\" (UID: \"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427\") " pod="openshift-marketplace/redhat-operators-7t2qg" Oct 06 07:48:09 crc kubenswrapper[4845]: I1006 07:48:09.008118 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4090ae9-a6e8-4c81-92cc-0ab97d5f7427-utilities\") pod \"redhat-operators-7t2qg\" (UID: \"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427\") " pod="openshift-marketplace/redhat-operators-7t2qg" Oct 06 07:48:09 crc kubenswrapper[4845]: I1006 07:48:09.008187 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq7zr\" (UniqueName: \"kubernetes.io/projected/b4090ae9-a6e8-4c81-92cc-0ab97d5f7427-kube-api-access-vq7zr\") pod \"redhat-operators-7t2qg\" (UID: \"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427\") " pod="openshift-marketplace/redhat-operators-7t2qg" Oct 06 07:48:09 crc kubenswrapper[4845]: I1006 07:48:09.110771 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq7zr\" (UniqueName: \"kubernetes.io/projected/b4090ae9-a6e8-4c81-92cc-0ab97d5f7427-kube-api-access-vq7zr\") pod \"redhat-operators-7t2qg\" (UID: \"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427\") " pod="openshift-marketplace/redhat-operators-7t2qg" Oct 06 07:48:09 crc kubenswrapper[4845]: I1006 07:48:09.111546 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4090ae9-a6e8-4c81-92cc-0ab97d5f7427-catalog-content\") pod \"redhat-operators-7t2qg\" (UID: \"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427\") " pod="openshift-marketplace/redhat-operators-7t2qg" Oct 06 07:48:09 crc kubenswrapper[4845]: I1006 07:48:09.111735 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4090ae9-a6e8-4c81-92cc-0ab97d5f7427-utilities\") pod \"redhat-operators-7t2qg\" (UID: \"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427\") " pod="openshift-marketplace/redhat-operators-7t2qg" Oct 06 07:48:09 crc kubenswrapper[4845]: I1006 07:48:09.112004 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4090ae9-a6e8-4c81-92cc-0ab97d5f7427-catalog-content\") pod \"redhat-operators-7t2qg\" (UID: \"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427\") " pod="openshift-marketplace/redhat-operators-7t2qg" Oct 06 07:48:09 crc kubenswrapper[4845]: I1006 07:48:09.112279 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4090ae9-a6e8-4c81-92cc-0ab97d5f7427-utilities\") pod \"redhat-operators-7t2qg\" (UID: \"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427\") " pod="openshift-marketplace/redhat-operators-7t2qg" Oct 06 07:48:09 crc kubenswrapper[4845]: I1006 07:48:09.142631 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq7zr\" (UniqueName: \"kubernetes.io/projected/b4090ae9-a6e8-4c81-92cc-0ab97d5f7427-kube-api-access-vq7zr\") pod \"redhat-operators-7t2qg\" (UID: \"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427\") " pod="openshift-marketplace/redhat-operators-7t2qg" Oct 06 07:48:09 crc kubenswrapper[4845]: I1006 07:48:09.190977 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7t2qg" Oct 06 07:48:09 crc kubenswrapper[4845]: I1006 07:48:09.694572 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7t2qg"] Oct 06 07:48:10 crc kubenswrapper[4845]: I1006 07:48:10.538320 4845 generic.go:334] "Generic (PLEG): container finished" podID="b4090ae9-a6e8-4c81-92cc-0ab97d5f7427" containerID="8152ccfbccf6cf9848ce4bfd2ebd87e55f4c69390e5ff0cabd16f1fda2dbeb3b" exitCode=0 Oct 06 07:48:10 crc kubenswrapper[4845]: I1006 07:48:10.538424 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7t2qg" event={"ID":"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427","Type":"ContainerDied","Data":"8152ccfbccf6cf9848ce4bfd2ebd87e55f4c69390e5ff0cabd16f1fda2dbeb3b"} Oct 06 07:48:10 crc kubenswrapper[4845]: I1006 07:48:10.538678 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7t2qg" event={"ID":"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427","Type":"ContainerStarted","Data":"79012d62ad6d1e3bca511417c738be40f8fd0395820439dc38ef61153e4237a3"} Oct 06 07:48:10 crc kubenswrapper[4845]: I1006 07:48:10.540531 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 07:48:12 crc kubenswrapper[4845]: I1006 07:48:12.556318 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7t2qg" event={"ID":"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427","Type":"ContainerStarted","Data":"c00dae9d2281eca86966198d60f028d7edf2f110b3d5ec77cb60cd2ece7a470c"} Oct 06 07:48:13 crc kubenswrapper[4845]: I1006 07:48:13.090705 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d_602b8691-5aec-4f79-b690-a517191505b0/util/0.log" Oct 06 07:48:13 crc kubenswrapper[4845]: I1006 07:48:13.276665 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d_602b8691-5aec-4f79-b690-a517191505b0/util/0.log" Oct 06 07:48:13 crc kubenswrapper[4845]: I1006 07:48:13.338869 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d_602b8691-5aec-4f79-b690-a517191505b0/pull/0.log" Oct 06 07:48:13 crc kubenswrapper[4845]: I1006 07:48:13.443187 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d_602b8691-5aec-4f79-b690-a517191505b0/pull/0.log" Oct 06 07:48:13 crc kubenswrapper[4845]: I1006 07:48:13.545861 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d_602b8691-5aec-4f79-b690-a517191505b0/util/0.log" Oct 06 07:48:13 crc kubenswrapper[4845]: I1006 07:48:13.557937 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d_602b8691-5aec-4f79-b690-a517191505b0/pull/0.log" Oct 06 07:48:13 crc kubenswrapper[4845]: I1006 07:48:13.565234 4845 generic.go:334] "Generic (PLEG): container finished" podID="b4090ae9-a6e8-4c81-92cc-0ab97d5f7427" containerID="c00dae9d2281eca86966198d60f028d7edf2f110b3d5ec77cb60cd2ece7a470c" exitCode=0 Oct 06 07:48:13 crc kubenswrapper[4845]: I1006 07:48:13.565452 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7t2qg" event={"ID":"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427","Type":"ContainerDied","Data":"c00dae9d2281eca86966198d60f028d7edf2f110b3d5ec77cb60cd2ece7a470c"} Oct 06 07:48:13 crc kubenswrapper[4845]: I1006 07:48:13.618805 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2hqj5d_602b8691-5aec-4f79-b690-a517191505b0/extract/0.log" Oct 06 07:48:13 crc kubenswrapper[4845]: I1006 07:48:13.732119 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m7kmk_ccce060c-c044-45d8-8d3c-92cc9e40198a/extract-utilities/0.log" Oct 06 07:48:13 crc kubenswrapper[4845]: I1006 07:48:13.951072 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m7kmk_ccce060c-c044-45d8-8d3c-92cc9e40198a/extract-utilities/0.log" Oct 06 07:48:13 crc kubenswrapper[4845]: I1006 07:48:13.993300 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m7kmk_ccce060c-c044-45d8-8d3c-92cc9e40198a/extract-content/0.log" Oct 06 07:48:14 crc kubenswrapper[4845]: I1006 07:48:14.013352 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m7kmk_ccce060c-c044-45d8-8d3c-92cc9e40198a/extract-content/0.log" Oct 06 07:48:14 crc kubenswrapper[4845]: I1006 07:48:14.186873 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m7kmk_ccce060c-c044-45d8-8d3c-92cc9e40198a/extract-utilities/0.log" Oct 06 07:48:14 crc kubenswrapper[4845]: I1006 07:48:14.186974 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m7kmk_ccce060c-c044-45d8-8d3c-92cc9e40198a/extract-content/0.log" Oct 06 07:48:14 crc kubenswrapper[4845]: I1006 07:48:14.367795 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2smdl_fada0329-244a-449f-bb74-2ee74eaba095/extract-utilities/0.log" Oct 06 07:48:14 crc kubenswrapper[4845]: I1006 07:48:14.574570 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7t2qg" 
event={"ID":"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427","Type":"ContainerStarted","Data":"b849c8baf876a830de63ba879e2b417a7f59d87385b9447e51eb6d58ef18b449"}
Oct 06 07:48:14 crc kubenswrapper[4845]: I1006 07:48:14.601060 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7t2qg" podStartSLOduration=3.138533708 podStartE2EDuration="6.601041387s" podCreationTimestamp="2025-10-06 07:48:08 +0000 UTC" firstStartedPulling="2025-10-06 07:48:10.540290677 +0000 UTC m=+3775.055031685" lastFinishedPulling="2025-10-06 07:48:14.002798356 +0000 UTC m=+3778.517539364" observedRunningTime="2025-10-06 07:48:14.590279464 +0000 UTC m=+3779.105020462" watchObservedRunningTime="2025-10-06 07:48:14.601041387 +0000 UTC m=+3779.115782395"
Oct 06 07:48:14 crc kubenswrapper[4845]: I1006 07:48:14.656834 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m7kmk_ccce060c-c044-45d8-8d3c-92cc9e40198a/registry-server/0.log"
Oct 06 07:48:14 crc kubenswrapper[4845]: I1006 07:48:14.938878 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2smdl_fada0329-244a-449f-bb74-2ee74eaba095/extract-content/0.log"
Oct 06 07:48:14 crc kubenswrapper[4845]: I1006 07:48:14.953007 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2smdl_fada0329-244a-449f-bb74-2ee74eaba095/extract-utilities/0.log"
Oct 06 07:48:14 crc kubenswrapper[4845]: I1006 07:48:14.961608 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2smdl_fada0329-244a-449f-bb74-2ee74eaba095/extract-content/0.log"
Oct 06 07:48:15 crc kubenswrapper[4845]: I1006 07:48:15.176406 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2smdl_fada0329-244a-449f-bb74-2ee74eaba095/extract-utilities/0.log"
Oct 06 07:48:15 crc kubenswrapper[4845]: I1006 07:48:15.192944 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2smdl_fada0329-244a-449f-bb74-2ee74eaba095/extract-content/0.log"
Oct 06 07:48:15 crc kubenswrapper[4845]: I1006 07:48:15.429457 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm_e70f5fe4-9f73-4919-87c7-2732d365bdd0/util/0.log"
Oct 06 07:48:15 crc kubenswrapper[4845]: I1006 07:48:15.437182 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2smdl_fada0329-244a-449f-bb74-2ee74eaba095/registry-server/0.log"
Oct 06 07:48:15 crc kubenswrapper[4845]: I1006 07:48:15.620476 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm_e70f5fe4-9f73-4919-87c7-2732d365bdd0/util/0.log"
Oct 06 07:48:15 crc kubenswrapper[4845]: I1006 07:48:15.645679 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm_e70f5fe4-9f73-4919-87c7-2732d365bdd0/pull/0.log"
Oct 06 07:48:15 crc kubenswrapper[4845]: I1006 07:48:15.687921 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm_e70f5fe4-9f73-4919-87c7-2732d365bdd0/pull/0.log"
Oct 06 07:48:15 crc kubenswrapper[4845]: I1006 07:48:15.899720 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm_e70f5fe4-9f73-4919-87c7-2732d365bdd0/extract/0.log"
Oct 06 07:48:15 crc kubenswrapper[4845]: I1006 07:48:15.929051 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm_e70f5fe4-9f73-4919-87c7-2732d365bdd0/util/0.log"
Oct 06 07:48:15 crc kubenswrapper[4845]: I1006 07:48:15.947111 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7jvbm_e70f5fe4-9f73-4919-87c7-2732d365bdd0/pull/0.log"
Oct 06 07:48:16 crc kubenswrapper[4845]: I1006 07:48:16.094893 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zxglv_b5f2f19d-7dbf-4265-8e4c-96739b00f6e2/marketplace-operator/0.log"
Oct 06 07:48:16 crc kubenswrapper[4845]: I1006 07:48:16.178661 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ctv9g_c769aa66-5169-4ec2-8993-540bd8bfcfca/extract-utilities/0.log"
Oct 06 07:48:16 crc kubenswrapper[4845]: I1006 07:48:16.955205 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ctv9g_c769aa66-5169-4ec2-8993-540bd8bfcfca/extract-content/0.log"
Oct 06 07:48:16 crc kubenswrapper[4845]: I1006 07:48:16.961825 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ctv9g_c769aa66-5169-4ec2-8993-540bd8bfcfca/extract-utilities/0.log"
Oct 06 07:48:16 crc kubenswrapper[4845]: I1006 07:48:16.995245 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ctv9g_c769aa66-5169-4ec2-8993-540bd8bfcfca/extract-content/0.log"
Oct 06 07:48:17 crc kubenswrapper[4845]: I1006 07:48:17.152321 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ctv9g_c769aa66-5169-4ec2-8993-540bd8bfcfca/extract-utilities/0.log"
Oct 06 07:48:17 crc kubenswrapper[4845]: I1006 07:48:17.247667 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ctv9g_c769aa66-5169-4ec2-8993-540bd8bfcfca/extract-content/0.log"
Oct 06 07:48:17 crc kubenswrapper[4845]: I1006 07:48:17.286412 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ctv9g_c769aa66-5169-4ec2-8993-540bd8bfcfca/registry-server/0.log"
Oct 06 07:48:17 crc kubenswrapper[4845]: I1006 07:48:17.304958 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7t2qg_b4090ae9-a6e8-4c81-92cc-0ab97d5f7427/extract-utilities/0.log"
Oct 06 07:48:17 crc kubenswrapper[4845]: I1006 07:48:17.471691 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7t2qg_b4090ae9-a6e8-4c81-92cc-0ab97d5f7427/extract-content/0.log"
Oct 06 07:48:17 crc kubenswrapper[4845]: I1006 07:48:17.479179 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7t2qg_b4090ae9-a6e8-4c81-92cc-0ab97d5f7427/extract-utilities/0.log"
Oct 06 07:48:17 crc kubenswrapper[4845]: I1006 07:48:17.503412 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7t2qg_b4090ae9-a6e8-4c81-92cc-0ab97d5f7427/extract-content/0.log"
Oct 06 07:48:17 crc kubenswrapper[4845]: I1006 07:48:17.645404 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7t2qg_b4090ae9-a6e8-4c81-92cc-0ab97d5f7427/extract-content/0.log"
Oct 06 07:48:17 crc kubenswrapper[4845]: I1006 07:48:17.701667 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7t2qg_b4090ae9-a6e8-4c81-92cc-0ab97d5f7427/extract-utilities/0.log"
Oct 06 07:48:17 crc kubenswrapper[4845]: I1006 07:48:17.713025 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7t2qg_b4090ae9-a6e8-4c81-92cc-0ab97d5f7427/registry-server/0.log"
Oct 06 07:48:17 crc kubenswrapper[4845]: I1006 07:48:17.734116 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-msb4z_738fc958-3a60-4780-aa02-8af7f6887aa6/extract-utilities/0.log"
Oct 06 07:48:17 crc kubenswrapper[4845]: I1006 07:48:17.947519 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-msb4z_738fc958-3a60-4780-aa02-8af7f6887aa6/extract-utilities/0.log"
Oct 06 07:48:17 crc kubenswrapper[4845]: I1006 07:48:17.967858 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-msb4z_738fc958-3a60-4780-aa02-8af7f6887aa6/extract-content/0.log"
Oct 06 07:48:17 crc kubenswrapper[4845]: I1006 07:48:17.974299 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-msb4z_738fc958-3a60-4780-aa02-8af7f6887aa6/extract-content/0.log"
Oct 06 07:48:18 crc kubenswrapper[4845]: I1006 07:48:18.200216 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-msb4z_738fc958-3a60-4780-aa02-8af7f6887aa6/extract-utilities/0.log"
Oct 06 07:48:18 crc kubenswrapper[4845]: I1006 07:48:18.213900 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-msb4z_738fc958-3a60-4780-aa02-8af7f6887aa6/extract-content/0.log"
Oct 06 07:48:18 crc kubenswrapper[4845]: I1006 07:48:18.617657 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-msb4z_738fc958-3a60-4780-aa02-8af7f6887aa6/registry-server/0.log"
Oct 06 07:48:19 crc kubenswrapper[4845]: I1006 07:48:19.191209 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7t2qg"
Oct 06 07:48:19 crc kubenswrapper[4845]: I1006 07:48:19.191251 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7t2qg"
Oct 06 07:48:19 crc kubenswrapper[4845]: I1006 07:48:19.241860 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7t2qg"
Oct 06 07:48:19 crc kubenswrapper[4845]: I1006 07:48:19.660927 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7t2qg"
Oct 06 07:48:19 crc kubenswrapper[4845]: I1006 07:48:19.706642 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7t2qg"]
Oct 06 07:48:20 crc kubenswrapper[4845]: I1006 07:48:20.227140 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d"
Oct 06 07:48:20 crc kubenswrapper[4845]: E1006 07:48:20.228386 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:48:21 crc kubenswrapper[4845]: I1006 07:48:21.627750 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7t2qg" podUID="b4090ae9-a6e8-4c81-92cc-0ab97d5f7427" containerName="registry-server" containerID="cri-o://b849c8baf876a830de63ba879e2b417a7f59d87385b9447e51eb6d58ef18b449" gracePeriod=2
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.071181 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7t2qg"
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.164820 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4090ae9-a6e8-4c81-92cc-0ab97d5f7427-catalog-content\") pod \"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427\" (UID: \"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427\") "
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.165271 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq7zr\" (UniqueName: \"kubernetes.io/projected/b4090ae9-a6e8-4c81-92cc-0ab97d5f7427-kube-api-access-vq7zr\") pod \"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427\" (UID: \"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427\") "
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.165349 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4090ae9-a6e8-4c81-92cc-0ab97d5f7427-utilities\") pod \"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427\" (UID: \"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427\") "
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.166179 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4090ae9-a6e8-4c81-92cc-0ab97d5f7427-utilities" (OuterVolumeSpecName: "utilities") pod "b4090ae9-a6e8-4c81-92cc-0ab97d5f7427" (UID: "b4090ae9-a6e8-4c81-92cc-0ab97d5f7427"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.171063 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4090ae9-a6e8-4c81-92cc-0ab97d5f7427-kube-api-access-vq7zr" (OuterVolumeSpecName: "kube-api-access-vq7zr") pod "b4090ae9-a6e8-4c81-92cc-0ab97d5f7427" (UID: "b4090ae9-a6e8-4c81-92cc-0ab97d5f7427"). InnerVolumeSpecName "kube-api-access-vq7zr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.263040 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4090ae9-a6e8-4c81-92cc-0ab97d5f7427-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4090ae9-a6e8-4c81-92cc-0ab97d5f7427" (UID: "b4090ae9-a6e8-4c81-92cc-0ab97d5f7427"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.267524 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq7zr\" (UniqueName: \"kubernetes.io/projected/b4090ae9-a6e8-4c81-92cc-0ab97d5f7427-kube-api-access-vq7zr\") on node \"crc\" DevicePath \"\""
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.267565 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4090ae9-a6e8-4c81-92cc-0ab97d5f7427-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.267577 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4090ae9-a6e8-4c81-92cc-0ab97d5f7427-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.641486 4845 generic.go:334] "Generic (PLEG): container finished" podID="b4090ae9-a6e8-4c81-92cc-0ab97d5f7427" containerID="b849c8baf876a830de63ba879e2b417a7f59d87385b9447e51eb6d58ef18b449" exitCode=0
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.641546 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7t2qg" event={"ID":"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427","Type":"ContainerDied","Data":"b849c8baf876a830de63ba879e2b417a7f59d87385b9447e51eb6d58ef18b449"}
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.641589 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7t2qg" event={"ID":"b4090ae9-a6e8-4c81-92cc-0ab97d5f7427","Type":"ContainerDied","Data":"79012d62ad6d1e3bca511417c738be40f8fd0395820439dc38ef61153e4237a3"}
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.641617 4845 scope.go:117] "RemoveContainer" containerID="b849c8baf876a830de63ba879e2b417a7f59d87385b9447e51eb6d58ef18b449"
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.641551 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7t2qg"
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.660945 4845 scope.go:117] "RemoveContainer" containerID="c00dae9d2281eca86966198d60f028d7edf2f110b3d5ec77cb60cd2ece7a470c"
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.672664 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7t2qg"]
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.679586 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7t2qg"]
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.684828 4845 scope.go:117] "RemoveContainer" containerID="8152ccfbccf6cf9848ce4bfd2ebd87e55f4c69390e5ff0cabd16f1fda2dbeb3b"
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.725057 4845 scope.go:117] "RemoveContainer" containerID="b849c8baf876a830de63ba879e2b417a7f59d87385b9447e51eb6d58ef18b449"
Oct 06 07:48:22 crc kubenswrapper[4845]: E1006 07:48:22.725714 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b849c8baf876a830de63ba879e2b417a7f59d87385b9447e51eb6d58ef18b449\": container with ID starting with b849c8baf876a830de63ba879e2b417a7f59d87385b9447e51eb6d58ef18b449 not found: ID does not exist" containerID="b849c8baf876a830de63ba879e2b417a7f59d87385b9447e51eb6d58ef18b449"
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.725761 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b849c8baf876a830de63ba879e2b417a7f59d87385b9447e51eb6d58ef18b449"} err="failed to get container status \"b849c8baf876a830de63ba879e2b417a7f59d87385b9447e51eb6d58ef18b449\": rpc error: code = NotFound desc = could not find container \"b849c8baf876a830de63ba879e2b417a7f59d87385b9447e51eb6d58ef18b449\": container with ID starting with b849c8baf876a830de63ba879e2b417a7f59d87385b9447e51eb6d58ef18b449 not found: ID does not exist"
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.725790 4845 scope.go:117] "RemoveContainer" containerID="c00dae9d2281eca86966198d60f028d7edf2f110b3d5ec77cb60cd2ece7a470c"
Oct 06 07:48:22 crc kubenswrapper[4845]: E1006 07:48:22.726468 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c00dae9d2281eca86966198d60f028d7edf2f110b3d5ec77cb60cd2ece7a470c\": container with ID starting with c00dae9d2281eca86966198d60f028d7edf2f110b3d5ec77cb60cd2ece7a470c not found: ID does not exist" containerID="c00dae9d2281eca86966198d60f028d7edf2f110b3d5ec77cb60cd2ece7a470c"
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.726494 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c00dae9d2281eca86966198d60f028d7edf2f110b3d5ec77cb60cd2ece7a470c"} err="failed to get container status \"c00dae9d2281eca86966198d60f028d7edf2f110b3d5ec77cb60cd2ece7a470c\": rpc error: code = NotFound desc = could not find container \"c00dae9d2281eca86966198d60f028d7edf2f110b3d5ec77cb60cd2ece7a470c\": container with ID starting with c00dae9d2281eca86966198d60f028d7edf2f110b3d5ec77cb60cd2ece7a470c not found: ID does not exist"
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.726509 4845 scope.go:117] "RemoveContainer" containerID="8152ccfbccf6cf9848ce4bfd2ebd87e55f4c69390e5ff0cabd16f1fda2dbeb3b"
Oct 06 07:48:22 crc kubenswrapper[4845]: E1006 07:48:22.726774 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8152ccfbccf6cf9848ce4bfd2ebd87e55f4c69390e5ff0cabd16f1fda2dbeb3b\": container with ID starting with 8152ccfbccf6cf9848ce4bfd2ebd87e55f4c69390e5ff0cabd16f1fda2dbeb3b not found: ID does not exist" containerID="8152ccfbccf6cf9848ce4bfd2ebd87e55f4c69390e5ff0cabd16f1fda2dbeb3b"
Oct 06 07:48:22 crc kubenswrapper[4845]: I1006 07:48:22.726791 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8152ccfbccf6cf9848ce4bfd2ebd87e55f4c69390e5ff0cabd16f1fda2dbeb3b"} err="failed to get container status \"8152ccfbccf6cf9848ce4bfd2ebd87e55f4c69390e5ff0cabd16f1fda2dbeb3b\": rpc error: code = NotFound desc = could not find container \"8152ccfbccf6cf9848ce4bfd2ebd87e55f4c69390e5ff0cabd16f1fda2dbeb3b\": container with ID starting with 8152ccfbccf6cf9848ce4bfd2ebd87e55f4c69390e5ff0cabd16f1fda2dbeb3b not found: ID does not exist"
Oct 06 07:48:24 crc kubenswrapper[4845]: I1006 07:48:24.239308 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4090ae9-a6e8-4c81-92cc-0ab97d5f7427" path="/var/lib/kubelet/pods/b4090ae9-a6e8-4c81-92cc-0ab97d5f7427/volumes"
Oct 06 07:48:33 crc kubenswrapper[4845]: I1006 07:48:33.227272 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d"
Oct 06 07:48:33 crc kubenswrapper[4845]: E1006 07:48:33.228145 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:48:46 crc kubenswrapper[4845]: I1006 07:48:46.232257 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d"
Oct 06 07:48:46 crc kubenswrapper[4845]: E1006 07:48:46.233209 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:49:01 crc kubenswrapper[4845]: I1006 07:49:01.227801 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d"
Oct 06 07:49:01 crc kubenswrapper[4845]: E1006 07:49:01.229112 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:49:15 crc kubenswrapper[4845]: I1006 07:49:15.227411 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d"
Oct 06 07:49:15 crc kubenswrapper[4845]: E1006 07:49:15.228510 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:49:30 crc kubenswrapper[4845]: I1006 07:49:30.230101 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d"
Oct 06 07:49:30 crc kubenswrapper[4845]: E1006 07:49:30.230829 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:49:45 crc kubenswrapper[4845]: I1006 07:49:45.227029 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d"
Oct 06 07:49:45 crc kubenswrapper[4845]: E1006 07:49:45.227965 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tpgm6_openshift-machine-config-operator(6936952c-09f0-48fd-8832-38c18202ae81)\"" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" podUID="6936952c-09f0-48fd-8832-38c18202ae81"
Oct 06 07:49:45 crc kubenswrapper[4845]: I1006 07:49:45.921350 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bkbsh"]
Oct 06 07:49:45 crc kubenswrapper[4845]: E1006 07:49:45.922485 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4090ae9-a6e8-4c81-92cc-0ab97d5f7427" containerName="extract-content"
Oct 06 07:49:45 crc kubenswrapper[4845]: I1006 07:49:45.922593 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4090ae9-a6e8-4c81-92cc-0ab97d5f7427" containerName="extract-content"
Oct 06 07:49:45 crc kubenswrapper[4845]: E1006 07:49:45.922693 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4090ae9-a6e8-4c81-92cc-0ab97d5f7427" containerName="extract-utilities"
Oct 06 07:49:45 crc kubenswrapper[4845]: I1006 07:49:45.922811 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4090ae9-a6e8-4c81-92cc-0ab97d5f7427" containerName="extract-utilities"
Oct 06 07:49:45 crc kubenswrapper[4845]: E1006 07:49:45.923045 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4090ae9-a6e8-4c81-92cc-0ab97d5f7427" containerName="registry-server"
Oct 06 07:49:45 crc kubenswrapper[4845]: I1006 07:49:45.923129 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4090ae9-a6e8-4c81-92cc-0ab97d5f7427" containerName="registry-server"
Oct 06 07:49:45 crc kubenswrapper[4845]: I1006 07:49:45.923493 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4090ae9-a6e8-4c81-92cc-0ab97d5f7427" containerName="registry-server"
Oct 06 07:49:45 crc kubenswrapper[4845]: I1006 07:49:45.925650 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkbsh"
Oct 06 07:49:45 crc kubenswrapper[4845]: I1006 07:49:45.938349 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkbsh"]
Oct 06 07:49:46 crc kubenswrapper[4845]: I1006 07:49:46.005280 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759920b3-aa9d-4900-8448-649f5d773ae5-utilities\") pod \"redhat-marketplace-bkbsh\" (UID: \"759920b3-aa9d-4900-8448-649f5d773ae5\") " pod="openshift-marketplace/redhat-marketplace-bkbsh"
Oct 06 07:49:46 crc kubenswrapper[4845]: I1006 07:49:46.005347 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk59d\" (UniqueName: \"kubernetes.io/projected/759920b3-aa9d-4900-8448-649f5d773ae5-kube-api-access-mk59d\") pod \"redhat-marketplace-bkbsh\" (UID: \"759920b3-aa9d-4900-8448-649f5d773ae5\") " pod="openshift-marketplace/redhat-marketplace-bkbsh"
Oct 06 07:49:46 crc kubenswrapper[4845]: I1006 07:49:46.005391 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759920b3-aa9d-4900-8448-649f5d773ae5-catalog-content\") pod \"redhat-marketplace-bkbsh\" (UID: \"759920b3-aa9d-4900-8448-649f5d773ae5\") " pod="openshift-marketplace/redhat-marketplace-bkbsh"
Oct 06 07:49:46 crc kubenswrapper[4845]: I1006 07:49:46.106682 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759920b3-aa9d-4900-8448-649f5d773ae5-utilities\") pod \"redhat-marketplace-bkbsh\" (UID: \"759920b3-aa9d-4900-8448-649f5d773ae5\") " pod="openshift-marketplace/redhat-marketplace-bkbsh"
Oct 06 07:49:46 crc kubenswrapper[4845]: I1006 07:49:46.106730 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk59d\" (UniqueName: \"kubernetes.io/projected/759920b3-aa9d-4900-8448-649f5d773ae5-kube-api-access-mk59d\") pod \"redhat-marketplace-bkbsh\" (UID: \"759920b3-aa9d-4900-8448-649f5d773ae5\") " pod="openshift-marketplace/redhat-marketplace-bkbsh"
Oct 06 07:49:46 crc kubenswrapper[4845]: I1006 07:49:46.106751 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759920b3-aa9d-4900-8448-649f5d773ae5-catalog-content\") pod \"redhat-marketplace-bkbsh\" (UID: \"759920b3-aa9d-4900-8448-649f5d773ae5\") " pod="openshift-marketplace/redhat-marketplace-bkbsh"
Oct 06 07:49:46 crc kubenswrapper[4845]: I1006 07:49:46.107184 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759920b3-aa9d-4900-8448-649f5d773ae5-utilities\") pod \"redhat-marketplace-bkbsh\" (UID: \"759920b3-aa9d-4900-8448-649f5d773ae5\") " pod="openshift-marketplace/redhat-marketplace-bkbsh"
Oct 06 07:49:46 crc kubenswrapper[4845]: I1006 07:49:46.107224 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759920b3-aa9d-4900-8448-649f5d773ae5-catalog-content\") pod \"redhat-marketplace-bkbsh\" (UID: \"759920b3-aa9d-4900-8448-649f5d773ae5\") " pod="openshift-marketplace/redhat-marketplace-bkbsh"
Oct 06 07:49:46 crc kubenswrapper[4845]: I1006 07:49:46.130515 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk59d\" (UniqueName: \"kubernetes.io/projected/759920b3-aa9d-4900-8448-649f5d773ae5-kube-api-access-mk59d\") pod \"redhat-marketplace-bkbsh\" (UID: \"759920b3-aa9d-4900-8448-649f5d773ae5\") " pod="openshift-marketplace/redhat-marketplace-bkbsh"
Oct 06 07:49:46 crc kubenswrapper[4845]: I1006 07:49:46.269418 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkbsh"
Oct 06 07:49:46 crc kubenswrapper[4845]: I1006 07:49:46.691284 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkbsh"]
Oct 06 07:49:47 crc kubenswrapper[4845]: I1006 07:49:47.443487 4845 generic.go:334] "Generic (PLEG): container finished" podID="759920b3-aa9d-4900-8448-649f5d773ae5" containerID="5366a6fe25a0da9e9f86c4e2f3f9a181bb22a636befdc7652c13f55c18133d03" exitCode=0
Oct 06 07:49:47 crc kubenswrapper[4845]: I1006 07:49:47.443549 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkbsh" event={"ID":"759920b3-aa9d-4900-8448-649f5d773ae5","Type":"ContainerDied","Data":"5366a6fe25a0da9e9f86c4e2f3f9a181bb22a636befdc7652c13f55c18133d03"}
Oct 06 07:49:47 crc kubenswrapper[4845]: I1006 07:49:47.443779 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkbsh" event={"ID":"759920b3-aa9d-4900-8448-649f5d773ae5","Type":"ContainerStarted","Data":"edd7e6adfe459a32161075f71adefd5e19dc6c621efb75ce580ed2b1192ce5ce"}
Oct 06 07:49:48 crc kubenswrapper[4845]: I1006 07:49:48.454015 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkbsh" event={"ID":"759920b3-aa9d-4900-8448-649f5d773ae5","Type":"ContainerStarted","Data":"1de28caf348da9a1c04e62b89485e169729ff88e499c80522e01080eeb7ecd80"}
Oct 06 07:49:49 crc kubenswrapper[4845]: I1006 07:49:49.466566 4845 generic.go:334] "Generic (PLEG): container finished" podID="759920b3-aa9d-4900-8448-649f5d773ae5" containerID="1de28caf348da9a1c04e62b89485e169729ff88e499c80522e01080eeb7ecd80" exitCode=0
Oct 06 07:49:49 crc kubenswrapper[4845]: I1006 07:49:49.466866 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkbsh" event={"ID":"759920b3-aa9d-4900-8448-649f5d773ae5","Type":"ContainerDied","Data":"1de28caf348da9a1c04e62b89485e169729ff88e499c80522e01080eeb7ecd80"}
Oct 06 07:49:50 crc kubenswrapper[4845]: I1006 07:49:50.476323 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkbsh" event={"ID":"759920b3-aa9d-4900-8448-649f5d773ae5","Type":"ContainerStarted","Data":"5762e3eaa0cac1e246fe0b42dfb79d0cf9c814db57f097fa85240ab1fb539eea"}
Oct 06 07:49:50 crc kubenswrapper[4845]: I1006 07:49:50.495663 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bkbsh" podStartSLOduration=3.076649248 podStartE2EDuration="5.495640283s" podCreationTimestamp="2025-10-06 07:49:45 +0000 UTC" firstStartedPulling="2025-10-06 07:49:47.445305356 +0000 UTC m=+3871.960046364" lastFinishedPulling="2025-10-06 07:49:49.864296381 +0000 UTC m=+3874.379037399" observedRunningTime="2025-10-06 07:49:50.493066437 +0000 UTC m=+3875.007807465" watchObservedRunningTime="2025-10-06 07:49:50.495640283 +0000 UTC m=+3875.010381311"
Oct 06 07:49:56 crc kubenswrapper[4845]: I1006 07:49:56.233393 4845 scope.go:117] "RemoveContainer" containerID="b61782710957c5f86520690932c647ec2ab13e382d3c1edbab5c89fada36349d"
Oct 06 07:49:56 crc kubenswrapper[4845]: I1006 07:49:56.269601 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bkbsh"
Oct 06 07:49:56 crc kubenswrapper[4845]: I1006 07:49:56.269981 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bkbsh"
Oct 06 07:49:56 crc kubenswrapper[4845]: I1006 07:49:56.366554 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bkbsh"
Oct 06 07:49:56 crc kubenswrapper[4845]: I1006 07:49:56.534471 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tpgm6" event={"ID":"6936952c-09f0-48fd-8832-38c18202ae81","Type":"ContainerStarted","Data":"02ce192b21ea0c6be69bdabdfdf78f58c758f047fff805434ca44f89779cd187"}
Oct 06 07:49:56 crc kubenswrapper[4845]: I1006 07:49:56.589615 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bkbsh"
Oct 06 07:49:56 crc kubenswrapper[4845]: I1006 07:49:56.639569 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkbsh"]
Oct 06 07:49:58 crc kubenswrapper[4845]: I1006 07:49:58.551107 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bkbsh" podUID="759920b3-aa9d-4900-8448-649f5d773ae5" containerName="registry-server" containerID="cri-o://5762e3eaa0cac1e246fe0b42dfb79d0cf9c814db57f097fa85240ab1fb539eea" gracePeriod=2
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.023907 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkbsh"
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.167107 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk59d\" (UniqueName: \"kubernetes.io/projected/759920b3-aa9d-4900-8448-649f5d773ae5-kube-api-access-mk59d\") pod \"759920b3-aa9d-4900-8448-649f5d773ae5\" (UID: \"759920b3-aa9d-4900-8448-649f5d773ae5\") "
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.167837 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759920b3-aa9d-4900-8448-649f5d773ae5-utilities\") pod \"759920b3-aa9d-4900-8448-649f5d773ae5\" (UID: \"759920b3-aa9d-4900-8448-649f5d773ae5\") "
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.168570 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759920b3-aa9d-4900-8448-649f5d773ae5-catalog-content\") pod \"759920b3-aa9d-4900-8448-649f5d773ae5\" (UID: \"759920b3-aa9d-4900-8448-649f5d773ae5\") "
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.168500 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/759920b3-aa9d-4900-8448-649f5d773ae5-utilities" (OuterVolumeSpecName: "utilities") pod "759920b3-aa9d-4900-8448-649f5d773ae5" (UID: "759920b3-aa9d-4900-8448-649f5d773ae5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.172566 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/759920b3-aa9d-4900-8448-649f5d773ae5-kube-api-access-mk59d" (OuterVolumeSpecName: "kube-api-access-mk59d") pod "759920b3-aa9d-4900-8448-649f5d773ae5" (UID: "759920b3-aa9d-4900-8448-649f5d773ae5"). InnerVolumeSpecName "kube-api-access-mk59d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.182515 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/759920b3-aa9d-4900-8448-649f5d773ae5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "759920b3-aa9d-4900-8448-649f5d773ae5" (UID: "759920b3-aa9d-4900-8448-649f5d773ae5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.184057 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759920b3-aa9d-4900-8448-649f5d773ae5-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.184141 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759920b3-aa9d-4900-8448-649f5d773ae5-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.184226 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk59d\" (UniqueName: \"kubernetes.io/projected/759920b3-aa9d-4900-8448-649f5d773ae5-kube-api-access-mk59d\") on node \"crc\" DevicePath \"\""
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.560203 4845 generic.go:334] "Generic (PLEG): container finished" podID="759920b3-aa9d-4900-8448-649f5d773ae5" containerID="5762e3eaa0cac1e246fe0b42dfb79d0cf9c814db57f097fa85240ab1fb539eea" exitCode=0
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.560274 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkbsh"
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.560282 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkbsh" event={"ID":"759920b3-aa9d-4900-8448-649f5d773ae5","Type":"ContainerDied","Data":"5762e3eaa0cac1e246fe0b42dfb79d0cf9c814db57f097fa85240ab1fb539eea"}
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.561311 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkbsh" event={"ID":"759920b3-aa9d-4900-8448-649f5d773ae5","Type":"ContainerDied","Data":"edd7e6adfe459a32161075f71adefd5e19dc6c621efb75ce580ed2b1192ce5ce"}
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.561422 4845 scope.go:117] "RemoveContainer" containerID="5762e3eaa0cac1e246fe0b42dfb79d0cf9c814db57f097fa85240ab1fb539eea"
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.583429 4845 scope.go:117] "RemoveContainer" containerID="1de28caf348da9a1c04e62b89485e169729ff88e499c80522e01080eeb7ecd80"
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.601438 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkbsh"]
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.607399 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkbsh"]
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.617931 4845 scope.go:117] "RemoveContainer" containerID="5366a6fe25a0da9e9f86c4e2f3f9a181bb22a636befdc7652c13f55c18133d03"
Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.653340 4845 scope.go:117] "RemoveContainer" containerID="5762e3eaa0cac1e246fe0b42dfb79d0cf9c814db57f097fa85240ab1fb539eea"
Oct 06 07:49:59 crc kubenswrapper[4845]: E1006 07:49:59.653932 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container
\"5762e3eaa0cac1e246fe0b42dfb79d0cf9c814db57f097fa85240ab1fb539eea\": container with ID starting with 5762e3eaa0cac1e246fe0b42dfb79d0cf9c814db57f097fa85240ab1fb539eea not found: ID does not exist" containerID="5762e3eaa0cac1e246fe0b42dfb79d0cf9c814db57f097fa85240ab1fb539eea" Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.653963 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5762e3eaa0cac1e246fe0b42dfb79d0cf9c814db57f097fa85240ab1fb539eea"} err="failed to get container status \"5762e3eaa0cac1e246fe0b42dfb79d0cf9c814db57f097fa85240ab1fb539eea\": rpc error: code = NotFound desc = could not find container \"5762e3eaa0cac1e246fe0b42dfb79d0cf9c814db57f097fa85240ab1fb539eea\": container with ID starting with 5762e3eaa0cac1e246fe0b42dfb79d0cf9c814db57f097fa85240ab1fb539eea not found: ID does not exist" Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.653987 4845 scope.go:117] "RemoveContainer" containerID="1de28caf348da9a1c04e62b89485e169729ff88e499c80522e01080eeb7ecd80" Oct 06 07:49:59 crc kubenswrapper[4845]: E1006 07:49:59.654282 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1de28caf348da9a1c04e62b89485e169729ff88e499c80522e01080eeb7ecd80\": container with ID starting with 1de28caf348da9a1c04e62b89485e169729ff88e499c80522e01080eeb7ecd80 not found: ID does not exist" containerID="1de28caf348da9a1c04e62b89485e169729ff88e499c80522e01080eeb7ecd80" Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.654300 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1de28caf348da9a1c04e62b89485e169729ff88e499c80522e01080eeb7ecd80"} err="failed to get container status \"1de28caf348da9a1c04e62b89485e169729ff88e499c80522e01080eeb7ecd80\": rpc error: code = NotFound desc = could not find container \"1de28caf348da9a1c04e62b89485e169729ff88e499c80522e01080eeb7ecd80\": container with ID 
starting with 1de28caf348da9a1c04e62b89485e169729ff88e499c80522e01080eeb7ecd80 not found: ID does not exist" Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.654311 4845 scope.go:117] "RemoveContainer" containerID="5366a6fe25a0da9e9f86c4e2f3f9a181bb22a636befdc7652c13f55c18133d03" Oct 06 07:49:59 crc kubenswrapper[4845]: E1006 07:49:59.654569 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5366a6fe25a0da9e9f86c4e2f3f9a181bb22a636befdc7652c13f55c18133d03\": container with ID starting with 5366a6fe25a0da9e9f86c4e2f3f9a181bb22a636befdc7652c13f55c18133d03 not found: ID does not exist" containerID="5366a6fe25a0da9e9f86c4e2f3f9a181bb22a636befdc7652c13f55c18133d03" Oct 06 07:49:59 crc kubenswrapper[4845]: I1006 07:49:59.654654 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5366a6fe25a0da9e9f86c4e2f3f9a181bb22a636befdc7652c13f55c18133d03"} err="failed to get container status \"5366a6fe25a0da9e9f86c4e2f3f9a181bb22a636befdc7652c13f55c18133d03\": rpc error: code = NotFound desc = could not find container \"5366a6fe25a0da9e9f86c4e2f3f9a181bb22a636befdc7652c13f55c18133d03\": container with ID starting with 5366a6fe25a0da9e9f86c4e2f3f9a181bb22a636befdc7652c13f55c18133d03 not found: ID does not exist" Oct 06 07:50:00 crc kubenswrapper[4845]: I1006 07:50:00.238132 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="759920b3-aa9d-4900-8448-649f5d773ae5" path="/var/lib/kubelet/pods/759920b3-aa9d-4900-8448-649f5d773ae5/volumes" Oct 06 07:50:09 crc kubenswrapper[4845]: I1006 07:50:09.653169 4845 generic.go:334] "Generic (PLEG): container finished" podID="1923757d-527d-4867-9c13-f732f7e10077" containerID="abf4c7857653e73e2d768be14bd29ed96776fe748fe9c486ece007b3c022d904" exitCode=0 Oct 06 07:50:09 crc kubenswrapper[4845]: I1006 07:50:09.653299 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-rz8wr/must-gather-sft27" event={"ID":"1923757d-527d-4867-9c13-f732f7e10077","Type":"ContainerDied","Data":"abf4c7857653e73e2d768be14bd29ed96776fe748fe9c486ece007b3c022d904"} Oct 06 07:50:09 crc kubenswrapper[4845]: I1006 07:50:09.654313 4845 scope.go:117] "RemoveContainer" containerID="abf4c7857653e73e2d768be14bd29ed96776fe748fe9c486ece007b3c022d904" Oct 06 07:50:10 crc kubenswrapper[4845]: I1006 07:50:10.654432 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rz8wr_must-gather-sft27_1923757d-527d-4867-9c13-f732f7e10077/gather/0.log" Oct 06 07:50:21 crc kubenswrapper[4845]: I1006 07:50:21.423098 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rz8wr/must-gather-sft27"] Oct 06 07:50:21 crc kubenswrapper[4845]: I1006 07:50:21.423927 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-rz8wr/must-gather-sft27" podUID="1923757d-527d-4867-9c13-f732f7e10077" containerName="copy" containerID="cri-o://55d343e729998928b9db295c1c683c86831a6c1d51d162e5de588bcc739d5a41" gracePeriod=2 Oct 06 07:50:21 crc kubenswrapper[4845]: I1006 07:50:21.437225 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rz8wr/must-gather-sft27"] Oct 06 07:50:21 crc kubenswrapper[4845]: I1006 07:50:21.774292 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rz8wr_must-gather-sft27_1923757d-527d-4867-9c13-f732f7e10077/copy/0.log" Oct 06 07:50:21 crc kubenswrapper[4845]: I1006 07:50:21.775582 4845 generic.go:334] "Generic (PLEG): container finished" podID="1923757d-527d-4867-9c13-f732f7e10077" containerID="55d343e729998928b9db295c1c683c86831a6c1d51d162e5de588bcc739d5a41" exitCode=143 Oct 06 07:50:21 crc kubenswrapper[4845]: I1006 07:50:21.775636 4845 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="50640948d237f86ae29dd2fc9b39c564e1f1c4b97c1d1419d13613b0301646f7" Oct 06 07:50:21 crc kubenswrapper[4845]: I1006 07:50:21.849897 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rz8wr_must-gather-sft27_1923757d-527d-4867-9c13-f732f7e10077/copy/0.log" Oct 06 07:50:21 crc kubenswrapper[4845]: I1006 07:50:21.850933 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rz8wr/must-gather-sft27" Oct 06 07:50:22 crc kubenswrapper[4845]: I1006 07:50:22.004366 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzpbh\" (UniqueName: \"kubernetes.io/projected/1923757d-527d-4867-9c13-f732f7e10077-kube-api-access-pzpbh\") pod \"1923757d-527d-4867-9c13-f732f7e10077\" (UID: \"1923757d-527d-4867-9c13-f732f7e10077\") " Oct 06 07:50:22 crc kubenswrapper[4845]: I1006 07:50:22.004583 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1923757d-527d-4867-9c13-f732f7e10077-must-gather-output\") pod \"1923757d-527d-4867-9c13-f732f7e10077\" (UID: \"1923757d-527d-4867-9c13-f732f7e10077\") " Oct 06 07:50:22 crc kubenswrapper[4845]: I1006 07:50:22.011956 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1923757d-527d-4867-9c13-f732f7e10077-kube-api-access-pzpbh" (OuterVolumeSpecName: "kube-api-access-pzpbh") pod "1923757d-527d-4867-9c13-f732f7e10077" (UID: "1923757d-527d-4867-9c13-f732f7e10077"). InnerVolumeSpecName "kube-api-access-pzpbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 07:50:22 crc kubenswrapper[4845]: I1006 07:50:22.107084 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzpbh\" (UniqueName: \"kubernetes.io/projected/1923757d-527d-4867-9c13-f732f7e10077-kube-api-access-pzpbh\") on node \"crc\" DevicePath \"\"" Oct 06 07:50:22 crc kubenswrapper[4845]: I1006 07:50:22.153678 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1923757d-527d-4867-9c13-f732f7e10077-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1923757d-527d-4867-9c13-f732f7e10077" (UID: "1923757d-527d-4867-9c13-f732f7e10077"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 07:50:22 crc kubenswrapper[4845]: I1006 07:50:22.209846 4845 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1923757d-527d-4867-9c13-f732f7e10077-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 06 07:50:22 crc kubenswrapper[4845]: I1006 07:50:22.237275 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1923757d-527d-4867-9c13-f732f7e10077" path="/var/lib/kubelet/pods/1923757d-527d-4867-9c13-f732f7e10077/volumes" Oct 06 07:50:22 crc kubenswrapper[4845]: I1006 07:50:22.782750 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rz8wr/must-gather-sft27" Oct 06 07:51:25 crc kubenswrapper[4845]: I1006 07:51:25.546607 4845 scope.go:117] "RemoveContainer" containerID="b1844b1379bca977f46bfd4e12af99ab78e47adc55d25ae59031cb2425ad4e7c" Oct 06 07:51:25 crc kubenswrapper[4845]: I1006 07:51:25.567236 4845 scope.go:117] "RemoveContainer" containerID="abf4c7857653e73e2d768be14bd29ed96776fe748fe9c486ece007b3c022d904" Oct 06 07:51:25 crc kubenswrapper[4845]: I1006 07:51:25.647420 4845 scope.go:117] "RemoveContainer" containerID="55d343e729998928b9db295c1c683c86831a6c1d51d162e5de588bcc739d5a41" var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515070672422024452 0ustar coreroot‹íÁ  ÷Om7 €7šÞ'(var/home/core/zuul-output/logs/crc-cloud/0000755000175000000000000000000015070672423017370 5ustar corerootvar/home/core/zuul-output/artifacts/0000755000175000017500000000000015070662274016516 5ustar corecorevar/home/core/zuul-output/docs/0000755000175000017500000000000015070662274015466 5ustar corecore